Service businesses such as banks have to worry about the problem of 'customer churn', i.e. customers leaving and joining another service provider. It is important to understand which aspects of the service influence a customer's decision to leave, so that management can concentrate service-improvement efforts on those priorities.
As a data scientist with the bank, you need to build a neural-network-based classifier that can determine whether a customer will leave the bank in the next 6 months.
CustomerId: Unique ID assigned to each customer
Surname: Last name of the customer
CreditScore: Credit score summarising the customer's credit history
Geography: The customer's location
Gender: The customer's gender
Age: Age of the customer
Tenure: Number of years the customer has been with the bank
NumOfProducts: Number of products the customer has purchased through the bank
Balance: Account balance
HasCrCard: Categorical variable indicating whether the customer has a credit card
EstimatedSalary: Estimated salary
IsActiveMember: Categorical variable indicating whether the customer is an active member of the bank (i.e. uses bank products regularly, makes transactions, etc.)
Exited: Whether or not the customer left the bank within six months; it takes two values, 0 = No (customer did not leave the bank) and 1 = Yes (customer left the bank)
# importing libraries
import pandas as pd
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt
import time
import warnings
#importing models
from sklearn.metrics import recall_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
#importing Neural Network
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Activation, BatchNormalization
from tensorflow.keras import backend
from tensorflow.keras.optimizers import Adam, Adagrad, RMSprop, SGD
from google.colab import drive
drive.mount('/content/MyDrive')
Mounted at /content/MyDrive
bankDataDF = pd.read_csv("/content/MyDrive/MyDrive/Great Learning/Projects/BankChurners/Churn.csv")
bankDataDF.info()
<class 'pandas.core.frame.DataFrame'> RangeIndex: 10000 entries, 0 to 9999 Data columns (total 14 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 RowNumber 10000 non-null int64 1 CustomerId 10000 non-null int64 2 Surname 10000 non-null object 3 CreditScore 10000 non-null int64 4 Geography 10000 non-null object 5 Gender 10000 non-null object 6 Age 10000 non-null int64 7 Tenure 10000 non-null int64 8 Balance 10000 non-null float64 9 NumOfProducts 10000 non-null int64 10 HasCrCard 10000 non-null int64 11 IsActiveMember 10000 non-null int64 12 EstimatedSalary 10000 non-null float64 13 Exited 10000 non-null int64 dtypes: float64(2), int64(9), object(3) memory usage: 1.1+ MB
bankDataDF.describe().T
| | count | mean | std | min | 25% | 50% | 75% | max |
|---|---|---|---|---|---|---|---|---|
| RowNumber | 10000.0 | 5.000500e+03 | 2886.895680 | 1.00 | 2500.75 | 5.000500e+03 | 7.500250e+03 | 10000.00 |
| CustomerId | 10000.0 | 1.569094e+07 | 71936.186123 | 15565701.00 | 15628528.25 | 1.569074e+07 | 1.575323e+07 | 15815690.00 |
| CreditScore | 10000.0 | 6.505288e+02 | 96.653299 | 350.00 | 584.00 | 6.520000e+02 | 7.180000e+02 | 850.00 |
| Age | 10000.0 | 3.892180e+01 | 10.487806 | 18.00 | 32.00 | 3.700000e+01 | 4.400000e+01 | 92.00 |
| Tenure | 10000.0 | 5.012800e+00 | 2.892174 | 0.00 | 3.00 | 5.000000e+00 | 7.000000e+00 | 10.00 |
| Balance | 10000.0 | 7.648589e+04 | 62397.405202 | 0.00 | 0.00 | 9.719854e+04 | 1.276442e+05 | 250898.09 |
| NumOfProducts | 10000.0 | 1.530200e+00 | 0.581654 | 1.00 | 1.00 | 1.000000e+00 | 2.000000e+00 | 4.00 |
| HasCrCard | 10000.0 | 7.055000e-01 | 0.455840 | 0.00 | 0.00 | 1.000000e+00 | 1.000000e+00 | 1.00 |
| IsActiveMember | 10000.0 | 5.151000e-01 | 0.499797 | 0.00 | 0.00 | 1.000000e+00 | 1.000000e+00 | 1.00 |
| EstimatedSalary | 10000.0 | 1.000902e+05 | 57510.492818 | 11.58 | 51002.11 | 1.001939e+05 | 1.493882e+05 | 199992.48 |
| Exited | 10000.0 | 2.037000e-01 | 0.402769 | 0.00 | 0.00 | 0.000000e+00 | 0.000000e+00 | 1.00 |
bankDataDF = bankDataDF.drop_duplicates()  # drop_duplicates() returns a new DataFrame, so assign the result back
bankDataDF.shape
(10000, 14)
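Since the shape is still (10000, 14), the data happens to contain no duplicate rows; but note that `drop_duplicates()` returns a new DataFrame rather than modifying the original in place, so the result must be assigned back for the call to have any effect. A minimal sketch on a toy frame:

```python
import pandas as pd

# drop_duplicates() returns a NEW DataFrame; the original is unchanged
# unless the result is assigned back (or inplace=True is used).
df = pd.DataFrame({"a": [1, 1, 2], "b": ["x", "x", "y"]})
df.drop_duplicates()        # no effect: the return value is discarded
assert df.shape == (3, 2)

df = df.drop_duplicates()   # assign the result back to keep the change
assert df.shape == (2, 2)
```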
bankDataDF.head()
| | RowNumber | CustomerId | Surname | CreditScore | Geography | Gender | Age | Tenure | Balance | NumOfProducts | HasCrCard | IsActiveMember | EstimatedSalary | Exited |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 15634602 | Hargrave | 619 | France | Female | 42 | 2 | 0.00 | 1 | 1 | 1 | 101348.88 | 1 |
| 1 | 2 | 15647311 | Hill | 608 | Spain | Female | 41 | 1 | 83807.86 | 1 | 0 | 1 | 112542.58 | 0 |
| 2 | 3 | 15619304 | Onio | 502 | France | Female | 42 | 8 | 159660.80 | 3 | 1 | 0 | 113931.57 | 1 |
| 3 | 4 | 15701354 | Boni | 699 | France | Female | 39 | 1 | 0.00 | 2 | 0 | 0 | 93826.63 | 0 |
| 4 | 5 | 15737888 | Mitchell | 850 | Spain | Female | 43 | 2 | 125510.82 | 1 | 1 | 1 | 79084.10 | 0 |
bankDataDF.isna().sum()
RowNumber 0 CustomerId 0 Surname 0 CreditScore 0 Geography 0 Gender 0 Age 0 Tenure 0 Balance 0 NumOfProducts 0 HasCrCard 0 IsActiveMember 0 EstimatedSalary 0 Exited 0 dtype: int64
# function to plot a boxplot and a histogram along the same scale.
def histogram_boxplot(data, feature, figsize=(12, 5), kde=False, bins=None):
"""
Boxplot and histogram combined
data: dataframe
feature: dataframe column
    figsize: size of figure (default (12, 5))
    kde: whether to show the density curve (default False)
bins: number of bins for histogram (default None)
"""
f2, (ax_box2, ax_hist2) = plt.subplots(
nrows=2, # Number of rows of the subplot grid= 2
sharex=True, # x-axis will be shared among all subplots
gridspec_kw={"height_ratios": (0.25, 0.75)},
figsize=figsize,
) # creating the 2 subplots
sns.boxplot(
data=data, x=feature, ax=ax_box2, showmeans=True, color="violet"
) # boxplot will be created and a triangle will indicate the mean value of the column
    # palette has no effect in histplot without `hue`, so it is omitted
    if bins:
        sns.histplot(data=data, x=feature, kde=kde, ax=ax_hist2, bins=bins)
    else:
        sns.histplot(data=data, x=feature, kde=kde, ax=ax_hist2)  # For histogram
ax_hist2.axvline(
data[feature].mean(), color="green", linestyle="--"
) # Add mean to the histogram
ax_hist2.axvline(
data[feature].median(), color="black", linestyle="-"
) # Add median to the histogram
# function to plot stacked bar chart
def stacked_barplot(data, predictor, target):
"""
Print the category counts and plot a stacked bar chart
data: dataframe
predictor: independent variable
target: target variable
"""
count = data[predictor].nunique()
sorter = data[target].value_counts().index[-1]
tab1 = pd.crosstab(data[predictor], data[target], margins=True).sort_values(
by=sorter, ascending=False
)
print(tab1)
print("-" * 120)
tab = pd.crosstab(data[predictor], data[target]).sort_values(
by=sorter, ascending=False
)
tab.plot(kind="bar", stacked=True, figsize=(count + 1, 5))
    plt.legend(loc="upper left", bbox_to_anchor=(1, 1), frameon=False)
plt.show()
### Function to plot distributions
def distribution_plot_wrt_target(data, predictor, target):
fig, axs = plt.subplots(2, 2, figsize=(12,10))
target_uniq = data[target].unique()
axs[0, 0].set_title(str(predictor) + " Distribution of target for target=" + str(target_uniq[0]))
sns.histplot(
data=data[data[target] == target_uniq[0]],
x=predictor,
kde=True,
ax=axs[0, 0],
color="teal",
)
axs[0, 1].set_title(str(predictor) + " Distribution of target for target=" + str(target_uniq[1]))
sns.histplot(
data=data[data[target] == target_uniq[1]],
x=predictor,
kde=True,
ax=axs[0, 1],
color="orange",
)
axs[1, 0].set_title("Boxplot of: " + str(predictor) + " w.r.t target")
sns.boxplot(data=data, x=target, y=predictor, ax=axs[1, 0], palette="gist_rainbow")
axs[1, 1].set_title("Boxplot (without outliers) of: " + str(predictor) + " w.r.t target")
sns.boxplot(
data=data,
x=target,
y=predictor,
ax=axs[1, 1],
showfliers=False,
palette="gist_rainbow",
)
plt.tight_layout()
plt.show()
bankDataDFImp = bankDataDF.drop(labels =['RowNumber', 'CustomerId', 'Surname'], axis=1)
results = bankDataDFImp.describe()
for features in results.columns:
histogram_boxplot(bankDataDFImp,features,kde=True)
plt.show()
The 25th percentile of Balance is 0, i.e. at least a quarter of the customers in the data have a zero bank balance.
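This follows directly from the zero-inflation of Balance: whenever at least 25% of the values are exactly 0, the first quartile collapses to 0. A quick check on hypothetical balances (illustrative values, not the bank data):

```python
import pandas as pd

# Hypothetical balances with a large spike at zero, mimicking the
# zero-inflated Balance column seen in the summary statistics.
balances = pd.Series([0.0, 0.0, 0.0, 50_000.0, 120_000.0,
                      90_000.0, 0.0, 75_000.0, 0.0, 110_000.0])
zero_share = (balances == 0).mean()   # fraction of exact zeros
q25 = balances.quantile(0.25)         # first quartile

# With >= 25% zeros, the 25th percentile is forced to 0.
assert zero_share >= 0.25
assert q25 == 0.0
```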
warnings.filterwarnings("ignore")
stacked_barplot_list = ['Geography', 'Gender', 'Age', 'Tenure','NumOfProducts', 'HasCrCard', 'IsActiveMember']
for feature in stacked_barplot_list:
stacked_barplot(bankDataDFImp,feature,'Exited')
Exited 0 1 All Geography All 7963 2037 10000 Germany 1695 814 2509 France 4204 810 5014 Spain 2064 413 2477 ------------------------------------------------------------------------------------------------------------------------
Exited 0 1 All Gender All 7963 2037 10000 Female 3404 1139 4543 Male 4559 898 5457 ------------------------------------------------------------------------------------------------------------------------
Exited 0 1 All Age All 7963 2037 10000 46 135 91 226 40 343 89 432 43 209 88 297 45 142 87 229 .. ... ... ... 79 4 0 4 78 5 0 5 77 10 0 10 76 11 0 11 75 9 0 9 [71 rows x 3 columns] ------------------------------------------------------------------------------------------------------------------------
Exited 0 1 All Tenure All 7963 2037 10000 1 803 232 1035 3 796 213 1009 9 771 213 984 5 803 209 1012 4 786 203 989 2 847 201 1048 8 828 197 1025 6 771 196 967 7 851 177 1028 10 389 101 490 0 318 95 413 ------------------------------------------------------------------------------------------------------------------------
Exited 0 1 All NumOfProducts All 7963 2037 10000 1 3675 1409 5084 2 4242 348 4590 3 46 220 266 4 0 60 60 ------------------------------------------------------------------------------------------------------------------------
Exited 0 1 All HasCrCard All 7963 2037 10000 1 5631 1424 7055 0 2332 613 2945 ------------------------------------------------------------------------------------------------------------------------
Exited 0 1 All IsActiveMember All 7963 2037 10000 0 3547 1302 4849 1 4416 735 5151 ------------------------------------------------------------------------------------------------------------------------
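The stacked bars are easier to compare as churn *rates* per category rather than raw counts. One way to get these directly (a sketch on a hypothetical mini-frame, not the bank data) is `pd.crosstab` with `normalize="index"`:

```python
import pandas as pd

# Tiny illustrative frame: Germany churns at 50%, France and Spain at 25%.
df = pd.DataFrame({
    "Geography": ["Germany"] * 4 + ["France"] * 4 + ["Spain"] * 4,
    "Exited":    [1, 1, 0, 0,     1, 0, 0, 0,      0, 0, 0, 1],
})

# normalize="index" turns raw counts into per-category proportions,
# which is what the stacked bars visualise.
rates = pd.crosstab(df["Geography"], df["Exited"], normalize="index")
assert abs(rates.loc["Germany", 1] - 0.50) < 1e-9
assert abs(rates.loc["France", 1] - 0.25) < 1e-9
```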
warnings.filterwarnings("ignore")
for feature in bankDataDFImp.columns:
distribution_plot_wrt_target(bankDataDFImp,feature,'Exited')
bankDataDFImp_heat = bankDataDFImp.copy()
replace_dict = {'France':1, 'Germany': 2, 'Spain': 3,'Female': 1, 'Male':0}
bankDataDFImp_heat.replace(to_replace =replace_dict, inplace=True)
bankDataDFImp_heat.head()
plt.figure(figsize=(10,10))
sns.heatmap(bankDataDFImp_heat.corr(),cbar=False, annot=True, cmap="Spectral", linewidth=.5)
plt.show()
sns.pairplot(data=bankDataDFImp)
<seaborn.axisgrid.PairGrid at 0x7dfcf142bb80>
bankDataDFImpWithDummies=pd.get_dummies(bankDataDFImp,drop_first=True,dtype=float)
bankDataDFImpWithDummies
| | CreditScore | Age | Tenure | Balance | NumOfProducts | HasCrCard | IsActiveMember | EstimatedSalary | Exited | Geography_Germany | Geography_Spain | Gender_Male |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 619 | 42 | 2 | 0.00 | 1 | 1 | 1 | 101348.88 | 1 | 0.0 | 0.0 | 0.0 |
| 1 | 608 | 41 | 1 | 83807.86 | 1 | 0 | 1 | 112542.58 | 0 | 0.0 | 1.0 | 0.0 |
| 2 | 502 | 42 | 8 | 159660.80 | 3 | 1 | 0 | 113931.57 | 1 | 0.0 | 0.0 | 0.0 |
| 3 | 699 | 39 | 1 | 0.00 | 2 | 0 | 0 | 93826.63 | 0 | 0.0 | 0.0 | 0.0 |
| 4 | 850 | 43 | 2 | 125510.82 | 1 | 1 | 1 | 79084.10 | 0 | 0.0 | 1.0 | 0.0 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 9995 | 771 | 39 | 5 | 0.00 | 2 | 1 | 0 | 96270.64 | 0 | 0.0 | 0.0 | 1.0 |
| 9996 | 516 | 35 | 10 | 57369.61 | 1 | 1 | 1 | 101699.77 | 0 | 0.0 | 0.0 | 1.0 |
| 9997 | 709 | 36 | 7 | 0.00 | 1 | 0 | 1 | 42085.58 | 1 | 0.0 | 0.0 | 0.0 |
| 9998 | 772 | 42 | 3 | 75075.31 | 2 | 1 | 0 | 92888.52 | 1 | 1.0 | 0.0 | 1.0 |
| 9999 | 792 | 28 | 4 | 130142.79 | 1 | 1 | 0 | 38190.78 | 0 | 0.0 | 0.0 | 0.0 |
10000 rows × 12 columns
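`drop_first=True` keeps k-1 indicator columns for a k-level category; the dropped level becomes the implied baseline (all indicators zero), which avoids a redundant, perfectly collinear column. A small sketch with the same two categoricals:

```python
import pandas as pd

# Three rows covering all Geography levels; France and Female are
# dropped as the baseline levels (first in alphabetical order).
df = pd.DataFrame({"Geography": ["France", "Germany", "Spain"],
                   "Gender": ["Female", "Male", "Female"]})
dummies = pd.get_dummies(df, drop_first=True, dtype=float)

assert list(dummies.columns) == ["Geography_Germany", "Geography_Spain", "Gender_Male"]
# A France/Female row is encoded as all zeros: the implied baseline.
assert dummies.loc[0].tolist() == [0.0, 0.0, 0.0]
```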
X=bankDataDFImpWithDummies.drop('Exited',axis=1)
Y=bankDataDFImpWithDummies['Exited']
X_train, X_test, y_train, y_test = train_test_split(X,Y,train_size=0.8,random_state=1,stratify=Y)
X_train, X_val, y_train, y_val = train_test_split(X_train,y_train,train_size=0.7,random_state=1,stratify=y_train)
print('Number of records in Training. :', X_train.shape[0])
print('Number of records in Validation :', X_val.shape[0])
print('Number of records in Testing. :', X_test.shape[0])
Number of records in Training. : 5600 Number of records in Validation : 2400 Number of records in Testing. : 2000
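`stratify=Y` keeps the class balance (here roughly 20% churners) the same in every split, which matters for an imbalanced target. A sketch on a synthetic 20/80 label vector:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# 20% positives, mirroring the ~20% churn rate in the bank data.
y = np.array([1] * 20 + [0] * 80)
X = np.arange(100).reshape(-1, 1)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, train_size=0.8, random_state=1, stratify=y
)
# Stratification preserves the positive rate in both splits.
assert abs(y_tr.mean() - 0.2) < 1e-12
assert abs(y_te.mean() - 0.2) < 1e-12
```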
# Scaling the numerical columns
from sklearn.preprocessing import StandardScaler
scaler = StandardScaler()
scaler.fit(X_train[['CreditScore','Age','Tenure','Balance', 'EstimatedSalary']])
X_train[['CreditScore','Age','Tenure','Balance', 'EstimatedSalary']] = scaler.transform(X_train[['CreditScore','Age','Tenure','Balance', 'EstimatedSalary']])
X_val[['CreditScore','Age','Tenure','Balance', 'EstimatedSalary']] = scaler.transform(X_val[['CreditScore','Age','Tenure','Balance', 'EstimatedSalary']])
X_test[['CreditScore','Age','Tenure','Balance', 'EstimatedSalary']] = scaler.transform(X_test[['CreditScore','Age','Tenure','Balance', 'EstimatedSalary']])
print ('First five rows of training \n\n', X_train.head(5))
First five rows of training
CreditScore Age Tenure Balance NumOfProducts HasCrCard \
6202 0.399937 0.944916 1.038097 0.659632 1 0
3254 -1.026235 1.039890 -0.337435 0.732420 1 1
9270 -1.573967 -1.714336 -0.681319 1.617099 1 1
6242 0.482614 1.229836 1.038097 0.342417 1 1
601 1.144027 -1.239469 0.694214 0.772931 2 1
IsActiveMember EstimatedSalary Geography_Germany Geography_Spain \
6202 1 -0.863385 0.0 1.0
3254 0 0.288961 1.0 0.0
9270 1 0.920040 0.0 0.0
6242 1 0.910569 1.0 0.0
601 1 1.192718 0.0 1.0
Gender_Male
6202 1.0
3254 1.0
9270 0.0
6242 1.0
601 0.0
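Fitting the scaler on the training split only, then reusing it to transform validation and test (as done above), prevents statistics of the held-out data from leaking into preprocessing. A sketch with `StandardScaler` on synthetic data:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_train = rng.normal(loc=50.0, scale=10.0, size=(100, 1))
X_test = rng.normal(loc=55.0, scale=10.0, size=(20, 1))

scaler = StandardScaler().fit(X_train)   # statistics come from train only
X_train_s = scaler.transform(X_train)
X_test_s = scaler.transform(X_test)      # same transform, no refitting

# Train becomes exactly zero-mean / unit-variance; test need not,
# since it was scaled with the training statistics.
assert abs(X_train_s.mean()) < 1e-9
assert abs(X_train_s.std() - 1.0) < 1e-9
```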
The problem statement requires building a model that predicts customer churn.
There are four possible prediction outcomes: a true positive (correctly flagging a churner), a true negative (correctly identifying a customer who stays), a false positive (flagging a customer who actually stays), and a false negative (missing a customer who actually churns).
In this case it is most important to reduce false negatives: a missed churner is a lost customer the bank never tried to retain. The metric that captures this is recall; a high recall means the model identifies most of the customers who would actually churn.
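Recall can be read directly off the confusion matrix as TP / (TP + FN). A sketch using the `recall_score` and `confusion_matrix` imports from above, on hypothetical labels:

```python
import numpy as np
from sklearn.metrics import recall_score, confusion_matrix

# Hypothetical labels: 4 actual churners, of which the model catches 3.
y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
y_pred = np.array([1, 1, 1, 0, 0, 0, 1, 0, 0, 0])

# sklearn's confusion_matrix ravels to (tn, fp, fn, tp) for binary labels.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
assert (tp, fn) == (3, 1)

# Recall = TP / (TP + FN): the share of actual churners the model catches.
assert recall_score(y_true, y_pred) == tp / (tp + fn)   # 0.75
```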
# Columns of the results dataframe: the hyperparameters and metrics tracked for each run.
columns = ["# hidden layers","# neurons - hidden layer","activation function - hidden layer ","# epochs","batch size","optimizer","learning rate, momentum, dropout","weight initializer","regularization","train loss","validation loss","train recall","validation recall","time (secs)", "model","history"]
#Creating a pandas dataframe.
results = pd.DataFrame(columns=columns)
i=0
# Define epoch count and batch size.
epochsize = 100
batchsize = 16
def model_fit(activationitem_l1,activationitem_l2,optimizer_val, Xtrain_data,Ytrain_data,epochsize,i,dropoutval=0,learning_rte=1e-3,momentumval=0.0):
history =[]
# to clear the previous sessions
backend.clear_session()
# Fixing the seed for random number generators
np.random.seed(42)
import random
random.seed(42)
tf.random.set_seed(42)
# We will be adding the layers sequentially
model_1 = Sequential()
    model_1.add(Dense(128, activation=activationitem_l1, kernel_initializer='he_uniform', input_dim=Xtrain_data.shape[1])) # First hidden layer with 128 neurons; input_dim is the number of independent variables
    model_1.add(Dense(128, activation=activationitem_l2, kernel_initializer='he_uniform')) # Second hidden layer with 128 neurons
    model_1.add(BatchNormalization())
    model_1.add(Dense(64, activation=activationitem_l2, kernel_initializer='he_uniform')) # Third hidden layer with 64 neurons
model_1.add(BatchNormalization())
model_1.add(Dense(1, activation='sigmoid')) # Output layer with only one neuron and sigmoid as activation function will give the probability of people exiting bank
if optimizer_val =='Adam':
model_1.compile(loss = 'binary_crossentropy', optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rte), metrics=[tf.keras.metrics.Recall()])
elif optimizer_val =='AdaGrad':
model_1.compile(loss = 'binary_crossentropy', optimizer=tf.keras.optimizers.Adagrad(learning_rate=learning_rte), metrics=[tf.keras.metrics.Recall()])
elif optimizer_val =='SGD':
model_1.compile(loss = 'binary_crossentropy', optimizer=tf.keras.optimizers.SGD(learning_rate=learning_rte), metrics=[tf.keras.metrics.Recall()])
elif optimizer_val =='RMS':
model_1.compile(loss = 'binary_crossentropy', optimizer=tf.keras.optimizers.RMSprop(learning_rate=learning_rte,momentum=momentumval), metrics=[tf.keras.metrics.Recall()])
elif optimizer_val =='SGD-Mom':
model_1.compile(loss = 'binary_crossentropy', optimizer=tf.keras.optimizers.SGD(learning_rate=learning_rte, momentum=momentumval), metrics=[tf.keras.metrics.Recall()])
model_1.summary()
start = time.time()
history = model_1.fit(Xtrain_data,
Ytrain_data,
#validation_split=0.1,
validation_data=(X_val,y_val),
epochs=epochsize,
batch_size=batchsize,
verbose=2)
end=time.time()
    results.loc[i] = [3,[128,128,64],[activationitem_l1,activationitem_l2],
epochsize,batchsize,optimizer_val,[learning_rte, momentumval ,dropoutval],"he_uniform","-",
history.history["loss"][-1],
history.history["val_loss"][-1],
history.history["recall"][-1],
history.history["val_recall"][-1],
round(end-start,2),
model_1, history]
def model_fit_with_dropout(activationitem_l1,activationitem_l2,optimizer_val,Xtrain_data,Ytrain_data,epochsize,i,dropoutval=0,learning_rte=1e-3,momentumval=0.0):
history =[]
# to clear the previous sessions
backend.clear_session()
# Fixing the seed for random number generators
np.random.seed(42)
import random
random.seed(42)
tf.random.set_seed(42)
# We will be adding the layers sequentially
model_2 = Sequential()
    model_2.add(Dense(128, activation=activationitem_l1, input_dim=Xtrain_data.shape[1],kernel_initializer='he_uniform')) # First hidden layer with 128 neurons; input_dim is the number of independent variables
    model_2.add(Dropout(dropoutval)) # dropout after the first hidden layer
    model_2.add(Dense(128, activation=activationitem_l2,kernel_initializer='he_uniform')) # Second hidden layer with 128 neurons
    model_2.add(BatchNormalization())
    model_2.add(Dropout(dropoutval)) # dropout after the second hidden layer
    model_2.add(Dense(64, activation=activationitem_l2,kernel_initializer='he_uniform')) # Third hidden layer with 64 neurons
model_2.add(BatchNormalization())
model_2.add(Dropout(dropoutval)) # defining dropout value for the layer
model_2.add(Dense(1, activation='sigmoid')) # Output layer with only one neuron and sigmoid as activation function will give the probability of people exiting bank
if optimizer_val =='Adam':
model_2.compile(loss = 'binary_crossentropy', optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rte), metrics=[tf.keras.metrics.Recall()])
elif optimizer_val =='AdaGrad':
model_2.compile(loss = 'binary_crossentropy', optimizer= tf.keras.optimizers.Adagrad(learning_rate=learning_rte), metrics=[tf.keras.metrics.Recall()])
elif optimizer_val =='SGD':
model_2.compile(loss = 'binary_crossentropy', optimizer= tf.keras.optimizers.SGD(learning_rate=learning_rte), metrics=[tf.keras.metrics.Recall()])
elif optimizer_val =='RMS':
model_2.compile(loss = 'binary_crossentropy', optimizer=tf.keras.optimizers.RMSprop(learning_rate=learning_rte,momentum=momentumval), metrics=[tf.keras.metrics.Recall()])
elif optimizer_val =='SGD-Mom':
model_2.compile(loss = 'binary_crossentropy', optimizer= tf.keras.optimizers.SGD(learning_rate=learning_rte, momentum=momentumval), metrics=[tf.keras.metrics.Recall()])
model_2.summary()
start = time.time()
history = model_2.fit(Xtrain_data,
Ytrain_data,
#validation_split=0.1,
validation_data=(X_val,y_val),
epochs=epochsize,
batch_size=batchsize,
verbose=2)
end=time.time()
    results.loc[i] = [3,[128,128,64],[activationitem_l1,activationitem_l2],
epochsize,batchsize,optimizer_val,[learning_rte, momentumval ,dropoutval],"he_uniform","-",
history.history["loss"][-1],
history.history["val_loss"][-1],
history.history["recall"][-1],
history.history["val_recall"][-1],
round(end-start,2),
model_2,
history]
def plot(history, name):
"""
Function to plot loss/accuracy
history: an object which stores the metrics and losses.
    name: the logged quantity to plot, e.g. 'loss' or 'recall'
"""
fig, ax = plt.subplots() #Creating a subplot with figure and axes.
plt.plot(history.history[name]) #Plotting the train accuracy or train loss
plt.plot(history.history['val_'+name]) #Plotting the validation accuracy or validation loss
plt.title('Model ' + name.capitalize()) #Defining the title of the plot.
plt.ylabel(name.capitalize()) #Capitalizing the first letter.
plt.xlabel('Epoch') #Defining the label for the x-axis.
fig.legend(['Train', 'Validation'], loc="outside right upper") #Defining the legend, loc controls the position of the legend.
model_fit('relu','relu','AdaGrad',X_train,y_train,100,i)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dense_1 (Dense) (None, 128) 16512
batch_normalization (Batch (None, 128) 512
Normalization)
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (Bat (None, 64) 256
chNormalization)
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
350/350 - 3s - loss: 0.7596 - recall: 0.5118 - val_loss: 0.6288 - val_recall: 0.4949 - 3s/epoch - 8ms/step
Epoch 2/100
350/350 - 1s - loss: 0.6287 - recall: 0.5521 - val_loss: 0.6268 - val_recall: 0.5665 - 1s/epoch - 3ms/step
Epoch 3/100
350/350 - 1s - loss: 0.5806 - recall: 0.5495 - val_loss: 0.5784 - val_recall: 0.5358 - 954ms/epoch - 3ms/step
Epoch 4/100
350/350 - 1s - loss: 0.5492 - recall: 0.5232 - val_loss: 0.5595 - val_recall: 0.5337 - 1s/epoch - 3ms/step
Epoch 5/100
350/350 - 1s - loss: 0.5338 - recall: 0.5083 - val_loss: 0.5445 - val_recall: 0.5153 - 1s/epoch - 3ms/step
Epoch 6/100
350/350 - 1s - loss: 0.5192 - recall: 0.4943 - val_loss: 0.5313 - val_recall: 0.5112 - 1s/epoch - 3ms/step
Epoch 7/100
350/350 - 1s - loss: 0.5035 - recall: 0.4926 - val_loss: 0.5121 - val_recall: 0.4847 - 911ms/epoch - 3ms/step
Epoch 8/100
350/350 - 1s - loss: 0.4936 - recall: 0.4838 - val_loss: 0.5063 - val_recall: 0.4888 - 963ms/epoch - 3ms/step
Epoch 9/100
350/350 - 1s - loss: 0.4770 - recall: 0.4855 - val_loss: 0.5012 - val_recall: 0.4888 - 1s/epoch - 3ms/step
Epoch 10/100
350/350 - 2s - loss: 0.4751 - recall: 0.4724 - val_loss: 0.4888 - val_recall: 0.4663 - 2s/epoch - 5ms/step
Epoch 11/100
350/350 - 2s - loss: 0.4728 - recall: 0.4540 - val_loss: 0.4819 - val_recall: 0.4581 - 2s/epoch - 5ms/step
Epoch 12/100
350/350 - 2s - loss: 0.4564 - recall: 0.4715 - val_loss: 0.4803 - val_recall: 0.4642 - 2s/epoch - 5ms/step
Epoch 13/100
350/350 - 1s - loss: 0.4522 - recall: 0.4514 - val_loss: 0.4738 - val_recall: 0.4560 - 1s/epoch - 3ms/step
Epoch 14/100
350/350 - 1s - loss: 0.4497 - recall: 0.4347 - val_loss: 0.4684 - val_recall: 0.4499 - 1s/epoch - 3ms/step
Epoch 15/100
350/350 - 1s - loss: 0.4452 - recall: 0.4338 - val_loss: 0.4632 - val_recall: 0.4294 - 929ms/epoch - 3ms/step
Epoch 16/100
350/350 - 1s - loss: 0.4347 - recall: 0.4400 - val_loss: 0.4573 - val_recall: 0.4254 - 1s/epoch - 3ms/step
Epoch 17/100
350/350 - 1s - loss: 0.4366 - recall: 0.4259 - val_loss: 0.4570 - val_recall: 0.4254 - 944ms/epoch - 3ms/step
Epoch 18/100
350/350 - 1s - loss: 0.4275 - recall: 0.4286 - val_loss: 0.4528 - val_recall: 0.4315 - 1s/epoch - 3ms/step
Epoch 19/100
350/350 - 1s - loss: 0.4289 - recall: 0.4233 - val_loss: 0.4444 - val_recall: 0.4110 - 1s/epoch - 3ms/step
Epoch 20/100
350/350 - 1s - loss: 0.4260 - recall: 0.4216 - val_loss: 0.4452 - val_recall: 0.4233 - 941ms/epoch - 3ms/step
Epoch 21/100
350/350 - 1s - loss: 0.4231 - recall: 0.4154 - val_loss: 0.4440 - val_recall: 0.4213 - 952ms/epoch - 3ms/step
Epoch 22/100
350/350 - 1s - loss: 0.4180 - recall: 0.4338 - val_loss: 0.4433 - val_recall: 0.4254 - 1s/epoch - 3ms/step
Epoch 23/100
350/350 - 2s - loss: 0.4208 - recall: 0.4128 - val_loss: 0.4393 - val_recall: 0.4192 - 2s/epoch - 5ms/step
Epoch 24/100
350/350 - 2s - loss: 0.4097 - recall: 0.4286 - val_loss: 0.4357 - val_recall: 0.4151 - 2s/epoch - 5ms/step
Epoch 25/100
350/350 - 1s - loss: 0.4085 - recall: 0.4198 - val_loss: 0.4360 - val_recall: 0.4151 - 1s/epoch - 4ms/step
Epoch 26/100
350/350 - 1s - loss: 0.4065 - recall: 0.4303 - val_loss: 0.4345 - val_recall: 0.4151 - 1s/epoch - 3ms/step
Epoch 27/100
350/350 - 1s - loss: 0.4108 - recall: 0.4032 - val_loss: 0.4334 - val_recall: 0.4172 - 894ms/epoch - 3ms/step
Epoch 28/100
350/350 - 1s - loss: 0.4094 - recall: 0.4224 - val_loss: 0.4281 - val_recall: 0.4049 - 1000ms/epoch - 3ms/step
Epoch 29/100
350/350 - 1s - loss: 0.4027 - recall: 0.4119 - val_loss: 0.4294 - val_recall: 0.4131 - 1s/epoch - 3ms/step
Epoch 30/100
350/350 - 1s - loss: 0.4037 - recall: 0.4067 - val_loss: 0.4274 - val_recall: 0.4110 - 1s/epoch - 3ms/step
Epoch 31/100
350/350 - 2s - loss: 0.4048 - recall: 0.3996 - val_loss: 0.4257 - val_recall: 0.4090 - 2s/epoch - 5ms/step
Epoch 32/100
350/350 - 1s - loss: 0.3967 - recall: 0.4207 - val_loss: 0.4260 - val_recall: 0.4049 - 1s/epoch - 3ms/step
Epoch 33/100
350/350 - 1s - loss: 0.3972 - recall: 0.3909 - val_loss: 0.4250 - val_recall: 0.4110 - 943ms/epoch - 3ms/step
Epoch 34/100
350/350 - 1s - loss: 0.3917 - recall: 0.4119 - val_loss: 0.4228 - val_recall: 0.4070 - 956ms/epoch - 3ms/step
Epoch 35/100
350/350 - 2s - loss: 0.3998 - recall: 0.4093 - val_loss: 0.4212 - val_recall: 0.4049 - 2s/epoch - 5ms/step
Epoch 36/100
350/350 - 2s - loss: 0.3940 - recall: 0.4137 - val_loss: 0.4221 - val_recall: 0.4090 - 2s/epoch - 5ms/step
Epoch 37/100
350/350 - 1s - loss: 0.3911 - recall: 0.4137 - val_loss: 0.4231 - val_recall: 0.4131 - 1s/epoch - 4ms/step
Epoch 38/100
350/350 - 1s - loss: 0.3894 - recall: 0.4040 - val_loss: 0.4176 - val_recall: 0.4008 - 987ms/epoch - 3ms/step
Epoch 39/100
350/350 - 1s - loss: 0.3924 - recall: 0.4040 - val_loss: 0.4177 - val_recall: 0.4029 - 1s/epoch - 3ms/step
Epoch 40/100
350/350 - 1s - loss: 0.3907 - recall: 0.4049 - val_loss: 0.4175 - val_recall: 0.4070 - 917ms/epoch - 3ms/step
Epoch 41/100
350/350 - 1s - loss: 0.3849 - recall: 0.4242 - val_loss: 0.4174 - val_recall: 0.4192 - 1s/epoch - 3ms/step
Epoch 42/100
350/350 - 1s - loss: 0.3889 - recall: 0.3891 - val_loss: 0.4163 - val_recall: 0.4070 - 991ms/epoch - 3ms/step
Epoch 43/100
350/350 - 1s - loss: 0.3855 - recall: 0.4084 - val_loss: 0.4156 - val_recall: 0.3988 - 985ms/epoch - 3ms/step
Epoch 44/100
350/350 - 1s - loss: 0.3810 - recall: 0.3979 - val_loss: 0.4131 - val_recall: 0.3988 - 964ms/epoch - 3ms/step
Epoch 45/100
350/350 - 1s - loss: 0.3798 - recall: 0.4145 - val_loss: 0.4151 - val_recall: 0.4090 - 1s/epoch - 3ms/step
Epoch 46/100
350/350 - 1s - loss: 0.3802 - recall: 0.4172 - val_loss: 0.4124 - val_recall: 0.4131 - 934ms/epoch - 3ms/step
Epoch 47/100
350/350 - 1s - loss: 0.3773 - recall: 0.4286 - val_loss: 0.4134 - val_recall: 0.4131 - 1s/epoch - 4ms/step
Epoch 48/100
350/350 - 2s - loss: 0.3794 - recall: 0.4294 - val_loss: 0.4131 - val_recall: 0.4192 - 2s/epoch - 4ms/step
Epoch 49/100
350/350 - 2s - loss: 0.3776 - recall: 0.4154 - val_loss: 0.4122 - val_recall: 0.4110 - 2s/epoch - 5ms/step
Epoch 50/100
350/350 - 1s - loss: 0.3771 - recall: 0.4119 - val_loss: 0.4107 - val_recall: 0.4172 - 1s/epoch - 3ms/step
Epoch 51/100
350/350 - 1s - loss: 0.3782 - recall: 0.4198 - val_loss: 0.4093 - val_recall: 0.4131 - 935ms/epoch - 3ms/step
Epoch 52/100
350/350 - 1s - loss: 0.3735 - recall: 0.4189 - val_loss: 0.4098 - val_recall: 0.4151 - 911ms/epoch - 3ms/step
Epoch 53/100
350/350 - 1s - loss: 0.3840 - recall: 0.4005 - val_loss: 0.4085 - val_recall: 0.4110 - 937ms/epoch - 3ms/step
Epoch 54/100
350/350 - 1s - loss: 0.3707 - recall: 0.4294 - val_loss: 0.4082 - val_recall: 0.4110 - 1s/epoch - 3ms/step
Epoch 55/100
350/350 - 1s - loss: 0.3689 - recall: 0.4242 - val_loss: 0.4104 - val_recall: 0.4172 - 911ms/epoch - 3ms/step
Epoch 56/100
350/350 - 1s - loss: 0.3759 - recall: 0.4128 - val_loss: 0.4067 - val_recall: 0.4110 - 1s/epoch - 3ms/step
Epoch 57/100
350/350 - 1s - loss: 0.3726 - recall: 0.4137 - val_loss: 0.4067 - val_recall: 0.4131 - 1s/epoch - 3ms/step
Epoch 58/100
350/350 - 1s - loss: 0.3690 - recall: 0.4251 - val_loss: 0.4069 - val_recall: 0.4131 - 1s/epoch - 3ms/step
Epoch 59/100
350/350 - 1s - loss: 0.3739 - recall: 0.4058 - val_loss: 0.4060 - val_recall: 0.4110 - 968ms/epoch - 3ms/step
Epoch 60/100
350/350 - 1s - loss: 0.3668 - recall: 0.4058 - val_loss: 0.4070 - val_recall: 0.4151 - 1s/epoch - 4ms/step
Epoch 61/100
350/350 - 1s - loss: 0.3693 - recall: 0.4189 - val_loss: 0.4063 - val_recall: 0.4070 - 1s/epoch - 4ms/step
Epoch 62/100
350/350 - 2s - loss: 0.3668 - recall: 0.4435 - val_loss: 0.4048 - val_recall: 0.4070 - 2s/epoch - 5ms/step
Epoch 63/100
350/350 - 1s - loss: 0.3713 - recall: 0.3961 - val_loss: 0.4050 - val_recall: 0.4172 - 1s/epoch - 3ms/step
Epoch 64/100
350/350 - 1s - loss: 0.3668 - recall: 0.4154 - val_loss: 0.4070 - val_recall: 0.4254 - 1s/epoch - 3ms/step
Epoch 65/100
350/350 - 1s - loss: 0.3672 - recall: 0.4110 - val_loss: 0.4068 - val_recall: 0.4254 - 1s/epoch - 3ms/step
Epoch 66/100
350/350 - 1s - loss: 0.3683 - recall: 0.4224 - val_loss: 0.4047 - val_recall: 0.4070 - 917ms/epoch - 3ms/step
Epoch 67/100
350/350 - 1s - loss: 0.3640 - recall: 0.4163 - val_loss: 0.4029 - val_recall: 0.3906 - 971ms/epoch - 3ms/step
Epoch 68/100
350/350 - 1s - loss: 0.3615 - recall: 0.4321 - val_loss: 0.4034 - val_recall: 0.4070 - 978ms/epoch - 3ms/step
Epoch 69/100
350/350 - 1s - loss: 0.3682 - recall: 0.4154 - val_loss: 0.4039 - val_recall: 0.4213 - 1s/epoch - 3ms/step
Epoch 70/100
350/350 - 1s - loss: 0.3672 - recall: 0.4207 - val_loss: 0.4036 - val_recall: 0.4131 - 948ms/epoch - 3ms/step
Epoch 71/100
350/350 - 1s - loss: 0.3648 - recall: 0.4172 - val_loss: 0.4031 - val_recall: 0.3947 - 886ms/epoch - 3ms/step
Epoch 72/100
350/350 - 1s - loss: 0.3553 - recall: 0.4408 - val_loss: 0.4028 - val_recall: 0.4131 - 1s/epoch - 3ms/step
Epoch 73/100
350/350 - 1s - loss: 0.3560 - recall: 0.4435 - val_loss: 0.4018 - val_recall: 0.4131 - 1s/epoch - 4ms/step
Epoch 74/100
350/350 - 2s - loss: 0.3681 - recall: 0.4075 - val_loss: 0.4004 - val_recall: 0.4172 - 2s/epoch - 5ms/step
Epoch 75/100
350/350 - 2s - loss: 0.3609 - recall: 0.4224 - val_loss: 0.4009 - val_recall: 0.4008 - 2s/epoch - 5ms/step
Epoch 76/100
350/350 - 1s - loss: 0.3577 - recall: 0.4233 - val_loss: 0.4017 - val_recall: 0.4131 - 963ms/epoch - 3ms/step
Epoch 77/100
350/350 - 1s - loss: 0.3647 - recall: 0.4356 - val_loss: 0.4029 - val_recall: 0.4274 - 874ms/epoch - 2ms/step
Epoch 78/100
350/350 - 1s - loss: 0.3621 - recall: 0.4391 - val_loss: 0.4026 - val_recall: 0.4151 - 889ms/epoch - 3ms/step
Epoch 79/100
350/350 - 1s - loss: 0.3585 - recall: 0.4277 - val_loss: 0.4003 - val_recall: 0.4049 - 890ms/epoch - 3ms/step
Epoch 80/100
350/350 - 1s - loss: 0.3548 - recall: 0.4408 - val_loss: 0.4009 - val_recall: 0.4029 - 970ms/epoch - 3ms/step
Epoch 81/100
350/350 - 1s - loss: 0.3564 - recall: 0.4373 - val_loss: 0.4013 - val_recall: 0.4254 - 998ms/epoch - 3ms/step
Epoch 82/100
350/350 - 1s - loss: 0.3531 - recall: 0.4505 - val_loss: 0.4000 - val_recall: 0.4110 - 1s/epoch - 3ms/step
Epoch 83/100
350/350 - 1s - loss: 0.3546 - recall: 0.4382 - val_loss: 0.4005 - val_recall: 0.4070 - 1s/epoch - 3ms/step
Epoch 84/100
350/350 - 1s - loss: 0.3529 - recall: 0.4452 - val_loss: 0.3987 - val_recall: 0.4029 - 911ms/epoch - 3ms/step
Epoch 85/100
350/350 - 1s - loss: 0.3543 - recall: 0.4365 - val_loss: 0.4001 - val_recall: 0.4090 - 943ms/epoch - 3ms/step
Epoch 86/100
350/350 - 1s - loss: 0.3547 - recall: 0.4268 - val_loss: 0.4015 - val_recall: 0.4172 - 1s/epoch - 4ms/step
Epoch 87/100
350/350 - 1s - loss: 0.3521 - recall: 0.4373 - val_loss: 0.4008 - val_recall: 0.4233 - 1s/epoch - 4ms/step
Epoch 88/100
350/350 - 2s - loss: 0.3550 - recall: 0.4294 - val_loss: 0.4009 - val_recall: 0.4151 - 2s/epoch - 5ms/step
Epoch 89/100
350/350 - 1s - loss: 0.3557 - recall: 0.4321 - val_loss: 0.3997 - val_recall: 0.4233 - 1s/epoch - 3ms/step
Epoch 90/100
350/350 - 1s - loss: 0.3571 - recall: 0.4207 - val_loss: 0.3994 - val_recall: 0.4029 - 1s/epoch - 3ms/step
Epoch 91/100
350/350 - 1s - loss: 0.3550 - recall: 0.4303 - val_loss: 0.4013 - val_recall: 0.4274 - 1s/epoch - 3ms/step
Epoch 92/100
350/350 - 1s - loss: 0.3474 - recall: 0.4461 - val_loss: 0.3993 - val_recall: 0.4151 - 1s/epoch - 3ms/step
Epoch 93/100
350/350 - 1s - loss: 0.3458 - recall: 0.4435 - val_loss: 0.3990 - val_recall: 0.4335 - 944ms/epoch - 3ms/step
Epoch 94/100
350/350 - 1s - loss: 0.3489 - recall: 0.4443 - val_loss: 0.3995 - val_recall: 0.4070 - 995ms/epoch - 3ms/step
Epoch 95/100
350/350 - 1s - loss: 0.3491 - recall: 0.4382 - val_loss: 0.3982 - val_recall: 0.4172 - 993ms/epoch - 3ms/step
Epoch 96/100
350/350 - 1s - loss: 0.3482 - recall: 0.4514 - val_loss: 0.3999 - val_recall: 0.4192 - 1s/epoch - 3ms/step
Epoch 97/100
350/350 - 1s - loss: 0.3528 - recall: 0.4391 - val_loss: 0.3981 - val_recall: 0.4213 - 1s/epoch - 3ms/step
Epoch 98/100
350/350 - 1s - loss: 0.3495 - recall: 0.4365 - val_loss: 0.3972 - val_recall: 0.4151 - 944ms/epoch - 3ms/step
Epoch 99/100
350/350 - 2s - loss: 0.3505 - recall: 0.4408 - val_loss: 0.3986 - val_recall: 0.4192 - 2s/epoch - 5ms/step
Epoch 100/100
350/350 - 2s - loss: 0.3491 - recall: 0.4505 - val_loss: 0.3999 - val_recall: 0.4335 - 2s/epoch - 5ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                                                       2
# neurons - hidden layer                                 [128, 128, 64]
activation function - hidden layer                         [relu, relu]
# epochs                                                            100
batch size                                                           16
optimizer                                                       AdaGrad
learning rate, momentum, dropout                        [0.001, 0.0, 0]
weight initializer                                           he_uniform
regularization                                                        -
train loss                                                      0.34906
validation loss                                                 0.39992
train recall                                                   0.450482
validation recall                                              0.433538
time (secs)                                                      117.99
model                 <keras.src.engine.sequential.Sequential object...
history               <keras.src.callbacks.History object at 0x7dfce...
Name: 0, dtype: object
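The `model_fit` helper is defined earlier in the notebook; the optimizer-name strings it accepts ('AdaGrad', 'SGD', 'SGD-Mom', 'RMS') presumably map to Keras optimizer constructors. A minimal sketch of that mapping, mirroring the hyperparameters logged in these runs (the dict name, structure, and `optimizer_spec` helper are assumptions, not the notebook's actual code):

```python
# Hypothetical mapping from the optimizer-name strings passed to model_fit
# to Keras optimizer constructor names and keyword arguments. The values
# mirror the runs logged in this section; the real helper is defined
# earlier in the notebook.
OPTIMIZER_CONFIGS = {
    'AdaGrad': ('Adagrad', {'learning_rate': 0.001}),
    'SGD':     ('SGD',     {'learning_rate': 0.001}),
    'SGD-Mom': ('SGD',     {'learning_rate': 0.001, 'momentum': 0.5}),
    'RMS':     ('RMSprop', {'learning_rate': 1e-4,  'momentum': 0.75}),
}

def optimizer_spec(name):
    """Return (constructor_name, kwargs) for a given optimizer string."""
    if name not in OPTIMIZER_CONFIGS:
        raise ValueError(f'Unknown optimizer: {name}')
    return OPTIMIZER_CONFIGS[name]
```

Each tuple would feed a call such as `tf.keras.optimizers.SGD(**kwargs)` when the model is compiled.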
i+=1
model_fit('relu','relu','SGD',X_train,y_train,100,i)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization) (None, 128) 512
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization) (None, 64) 256
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
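The Total params figure in the summary above can be verified by hand. The input dimension of 11 is inferred here from the first Dense layer's 1,536 parameters (128×11 weights + 128 biases); it is not stated in this section:

```python
# Hand check of the parameter counts in the model summary above.
def dense_params(n_in, n_out):
    return n_in * n_out + n_out     # weights + biases

def bn_params(n):
    return 4 * n                    # gamma, beta + moving mean, variance

total = (dense_params(11, 128)      # dense
         + dense_params(128, 128)   # dense_1
         + bn_params(128)           # batch_normalization
         + dense_params(128, 64)    # dense_2
         + bn_params(64)            # batch_normalization_1
         + dense_params(64, 1))     # dense_3
non_trainable = 2 * (128 + 64)      # BN moving mean/variance are frozen

print(total, non_trainable)         # 27137 384
```

The 384 non-trainable parameters are exactly the moving mean and variance of the two BatchNormalization layers, matching the summary.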
Epoch 1/100
350/350 - 2s - loss: 0.8221 - recall: 0.4768 - val_loss: 0.6729 - val_recall: 0.4642 - 2s/epoch - 7ms/step
Epoch 2/100
350/350 - 1s - loss: 0.6444 - recall: 0.5031 - val_loss: 0.6242 - val_recall: 0.4888 - 890ms/epoch - 3ms/step
Epoch 3/100
350/350 - 1s - loss: 0.5700 - recall: 0.4628 - val_loss: 0.5608 - val_recall: 0.4274 - 898ms/epoch - 3ms/step
Epoch 4/100
350/350 - 1s - loss: 0.5264 - recall: 0.4382 - val_loss: 0.5321 - val_recall: 0.4131 - 1s/epoch - 3ms/step
Epoch 5/100
350/350 - 1s - loss: 0.5032 - recall: 0.4067 - val_loss: 0.5112 - val_recall: 0.4008 - 881ms/epoch - 3ms/step
Epoch 6/100
350/350 - 1s - loss: 0.4858 - recall: 0.3865 - val_loss: 0.4946 - val_recall: 0.3988 - 995ms/epoch - 3ms/step
Epoch 7/100
350/350 - 1s - loss: 0.4662 - recall: 0.3812 - val_loss: 0.4770 - val_recall: 0.3620 - 1s/epoch - 3ms/step
Epoch 8/100
350/350 - 1s - loss: 0.4572 - recall: 0.3497 - val_loss: 0.4688 - val_recall: 0.3579 - 1s/epoch - 3ms/step
Epoch 9/100
350/350 - 1s - loss: 0.4410 - recall: 0.3506 - val_loss: 0.4626 - val_recall: 0.3579 - 1s/epoch - 3ms/step
Epoch 10/100
350/350 - 2s - loss: 0.4402 - recall: 0.3348 - val_loss: 0.4538 - val_recall: 0.3395 - 2s/epoch - 4ms/step
Epoch 11/100
350/350 - 2s - loss: 0.4394 - recall: 0.3129 - val_loss: 0.4489 - val_recall: 0.3374 - 2s/epoch - 5ms/step
Epoch 12/100
350/350 - 2s - loss: 0.4233 - recall: 0.3339 - val_loss: 0.4460 - val_recall: 0.3558 - 2s/epoch - 5ms/step
Epoch 13/100
350/350 - 1s - loss: 0.4210 - recall: 0.3243 - val_loss: 0.4407 - val_recall: 0.3436 - 980ms/epoch - 3ms/step
Epoch 14/100
350/350 - 1s - loss: 0.4198 - recall: 0.3120 - val_loss: 0.4378 - val_recall: 0.3354 - 982ms/epoch - 3ms/step
Epoch 15/100
350/350 - 1s - loss: 0.4145 - recall: 0.3216 - val_loss: 0.4347 - val_recall: 0.3333 - 1s/epoch - 3ms/step
Epoch 16/100
350/350 - 1s - loss: 0.4079 - recall: 0.3138 - val_loss: 0.4304 - val_recall: 0.3415 - 1s/epoch - 3ms/step
Epoch 17/100
350/350 - 1s - loss: 0.4096 - recall: 0.3260 - val_loss: 0.4298 - val_recall: 0.3517 - 945ms/epoch - 3ms/step
Epoch 18/100
350/350 - 1s - loss: 0.4004 - recall: 0.3383 - val_loss: 0.4280 - val_recall: 0.3558 - 988ms/epoch - 3ms/step
Epoch 19/100
350/350 - 1s - loss: 0.4035 - recall: 0.3252 - val_loss: 0.4257 - val_recall: 0.3395 - 901ms/epoch - 3ms/step
Epoch 20/100
350/350 - 1s - loss: 0.4002 - recall: 0.3462 - val_loss: 0.4227 - val_recall: 0.3436 - 987ms/epoch - 3ms/step
Epoch 21/100
350/350 - 1s - loss: 0.3996 - recall: 0.3173 - val_loss: 0.4232 - val_recall: 0.3599 - 989ms/epoch - 3ms/step
Epoch 22/100
350/350 - 2s - loss: 0.3943 - recall: 0.3523 - val_loss: 0.4223 - val_recall: 0.3620 - 2s/epoch - 7ms/step
Epoch 23/100
350/350 - 2s - loss: 0.3980 - recall: 0.3322 - val_loss: 0.4189 - val_recall: 0.3640 - 2s/epoch - 5ms/step
Epoch 24/100
350/350 - 1s - loss: 0.3844 - recall: 0.3558 - val_loss: 0.4177 - val_recall: 0.3599 - 1s/epoch - 4ms/step
Epoch 25/100
350/350 - 1s - loss: 0.3851 - recall: 0.3734 - val_loss: 0.4174 - val_recall: 0.3742 - 908ms/epoch - 3ms/step
Epoch 26/100
350/350 - 1s - loss: 0.3839 - recall: 0.3725 - val_loss: 0.4157 - val_recall: 0.3681 - 992ms/epoch - 3ms/step
Epoch 27/100
350/350 - 1s - loss: 0.3881 - recall: 0.3532 - val_loss: 0.4149 - val_recall: 0.3722 - 1s/epoch - 3ms/step
Epoch 28/100
350/350 - 1s - loss: 0.3842 - recall: 0.3681 - val_loss: 0.4133 - val_recall: 0.3681 - 972ms/epoch - 3ms/step
Epoch 29/100
350/350 - 1s - loss: 0.3815 - recall: 0.3567 - val_loss: 0.4131 - val_recall: 0.3865 - 878ms/epoch - 3ms/step
Epoch 30/100
350/350 - 1s - loss: 0.3812 - recall: 0.3690 - val_loss: 0.4114 - val_recall: 0.3804 - 976ms/epoch - 3ms/step
Epoch 31/100
350/350 - 1s - loss: 0.3815 - recall: 0.3637 - val_loss: 0.4093 - val_recall: 0.3783 - 989ms/epoch - 3ms/step
Epoch 32/100
350/350 - 1s - loss: 0.3743 - recall: 0.3786 - val_loss: 0.4112 - val_recall: 0.3824 - 996ms/epoch - 3ms/step
Epoch 33/100
350/350 - 1s - loss: 0.3748 - recall: 0.3628 - val_loss: 0.4092 - val_recall: 0.3845 - 1s/epoch - 3ms/step
Epoch 34/100
350/350 - 1s - loss: 0.3682 - recall: 0.3821 - val_loss: 0.4076 - val_recall: 0.3885 - 1s/epoch - 3ms/step
Epoch 35/100
350/350 - 2s - loss: 0.3773 - recall: 0.3769 - val_loss: 0.4070 - val_recall: 0.3926 - 2s/epoch - 5ms/step
Epoch 36/100
350/350 - 2s - loss: 0.3719 - recall: 0.3926 - val_loss: 0.4059 - val_recall: 0.3947 - 2s/epoch - 5ms/step
Epoch 37/100
350/350 - 1s - loss: 0.3692 - recall: 0.3900 - val_loss: 0.4066 - val_recall: 0.4008 - 1s/epoch - 4ms/step
Epoch 38/100
350/350 - 1s - loss: 0.3673 - recall: 0.3970 - val_loss: 0.4039 - val_recall: 0.3926 - 977ms/epoch - 3ms/step
Epoch 39/100
350/350 - 1s - loss: 0.3692 - recall: 0.3865 - val_loss: 0.4037 - val_recall: 0.3865 - 867ms/epoch - 2ms/step
Epoch 40/100
350/350 - 1s - loss: 0.3668 - recall: 0.3935 - val_loss: 0.4043 - val_recall: 0.3988 - 989ms/epoch - 3ms/step
Epoch 41/100
350/350 - 1s - loss: 0.3611 - recall: 0.4224 - val_loss: 0.4039 - val_recall: 0.4090 - 897ms/epoch - 3ms/step
Epoch 42/100
350/350 - 1s - loss: 0.3647 - recall: 0.3970 - val_loss: 0.4041 - val_recall: 0.4110 - 991ms/epoch - 3ms/step
Epoch 43/100
350/350 - 1s - loss: 0.3623 - recall: 0.4084 - val_loss: 0.4031 - val_recall: 0.4131 - 919ms/epoch - 3ms/step
Epoch 44/100
350/350 - 1s - loss: 0.3567 - recall: 0.3918 - val_loss: 0.3995 - val_recall: 0.3845 - 1s/epoch - 3ms/step
Epoch 45/100
350/350 - 1s - loss: 0.3511 - recall: 0.4338 - val_loss: 0.4034 - val_recall: 0.4131 - 1s/epoch - 3ms/step
Epoch 46/100
350/350 - 1s - loss: 0.3529 - recall: 0.4216 - val_loss: 0.4031 - val_recall: 0.4192 - 1s/epoch - 3ms/step
Epoch 47/100
350/350 - 1s - loss: 0.3514 - recall: 0.4251 - val_loss: 0.4016 - val_recall: 0.4192 - 921ms/epoch - 3ms/step
Epoch 48/100
350/350 - 2s - loss: 0.3548 - recall: 0.4382 - val_loss: 0.4018 - val_recall: 0.4294 - 2s/epoch - 6ms/step
Epoch 49/100
350/350 - 2s - loss: 0.3516 - recall: 0.4137 - val_loss: 0.4018 - val_recall: 0.4294 - 2s/epoch - 5ms/step
Epoch 50/100
350/350 - 1s - loss: 0.3493 - recall: 0.4514 - val_loss: 0.4002 - val_recall: 0.4233 - 1s/epoch - 4ms/step
Epoch 51/100
350/350 - 1s - loss: 0.3531 - recall: 0.4242 - val_loss: 0.3995 - val_recall: 0.4254 - 1s/epoch - 3ms/step
Epoch 52/100
350/350 - 1s - loss: 0.3484 - recall: 0.4452 - val_loss: 0.3989 - val_recall: 0.4376 - 916ms/epoch - 3ms/step
Epoch 53/100
350/350 - 1s - loss: 0.3566 - recall: 0.4408 - val_loss: 0.3996 - val_recall: 0.4294 - 1s/epoch - 3ms/step
Epoch 54/100
350/350 - 1s - loss: 0.3421 - recall: 0.4654 - val_loss: 0.3994 - val_recall: 0.4254 - 871ms/epoch - 2ms/step
Epoch 55/100
350/350 - 1s - loss: 0.3419 - recall: 0.4557 - val_loss: 0.4004 - val_recall: 0.4274 - 867ms/epoch - 2ms/step
Epoch 56/100
350/350 - 1s - loss: 0.3478 - recall: 0.4347 - val_loss: 0.3976 - val_recall: 0.4233 - 994ms/epoch - 3ms/step
Epoch 57/100
350/350 - 1s - loss: 0.3454 - recall: 0.4566 - val_loss: 0.3980 - val_recall: 0.4254 - 876ms/epoch - 3ms/step
Epoch 58/100
350/350 - 1s - loss: 0.3408 - recall: 0.4741 - val_loss: 0.3999 - val_recall: 0.4417 - 976ms/epoch - 3ms/step
Epoch 59/100
350/350 - 1s - loss: 0.3441 - recall: 0.4443 - val_loss: 0.3981 - val_recall: 0.4233 - 1s/epoch - 3ms/step
Epoch 60/100
350/350 - 1s - loss: 0.3372 - recall: 0.4566 - val_loss: 0.3992 - val_recall: 0.4356 - 1s/epoch - 3ms/step
Epoch 61/100
350/350 - 1s - loss: 0.3400 - recall: 0.4628 - val_loss: 0.4003 - val_recall: 0.4233 - 1s/epoch - 4ms/step
Epoch 62/100
350/350 - 1s - loss: 0.3367 - recall: 0.4777 - val_loss: 0.3987 - val_recall: 0.4335 - 1s/epoch - 4ms/step
Epoch 63/100
350/350 - 2s - loss: 0.3426 - recall: 0.4575 - val_loss: 0.3994 - val_recall: 0.4376 - 2s/epoch - 5ms/step
Epoch 64/100
350/350 - 1s - loss: 0.3393 - recall: 0.4610 - val_loss: 0.4032 - val_recall: 0.4417 - 1s/epoch - 3ms/step
Epoch 65/100
350/350 - 1s - loss: 0.3370 - recall: 0.4566 - val_loss: 0.3999 - val_recall: 0.4438 - 990ms/epoch - 3ms/step
Epoch 66/100
350/350 - 1s - loss: 0.3373 - recall: 0.4689 - val_loss: 0.4004 - val_recall: 0.4376 - 957ms/epoch - 3ms/step
Epoch 67/100
350/350 - 1s - loss: 0.3365 - recall: 0.4636 - val_loss: 0.3989 - val_recall: 0.4315 - 1s/epoch - 3ms/step
Epoch 68/100
350/350 - 1s - loss: 0.3292 - recall: 0.4847 - val_loss: 0.4006 - val_recall: 0.4519 - 965ms/epoch - 3ms/step
Epoch 69/100
350/350 - 1s - loss: 0.3390 - recall: 0.4689 - val_loss: 0.3996 - val_recall: 0.4438 - 898ms/epoch - 3ms/step
Epoch 70/100
350/350 - 1s - loss: 0.3351 - recall: 0.4663 - val_loss: 0.3995 - val_recall: 0.4417 - 1s/epoch - 3ms/step
Epoch 71/100
350/350 - 1s - loss: 0.3349 - recall: 0.4733 - val_loss: 0.4012 - val_recall: 0.4397 - 1s/epoch - 3ms/step
Epoch 72/100
350/350 - 1s - loss: 0.3248 - recall: 0.4899 - val_loss: 0.4003 - val_recall: 0.4397 - 903ms/epoch - 3ms/step
Epoch 73/100
350/350 - 1s - loss: 0.3235 - recall: 0.4908 - val_loss: 0.3994 - val_recall: 0.4438 - 1s/epoch - 3ms/step
Epoch 74/100
350/350 - 2s - loss: 0.3369 - recall: 0.4698 - val_loss: 0.3972 - val_recall: 0.4438 - 2s/epoch - 5ms/step
Epoch 75/100
350/350 - 2s - loss: 0.3276 - recall: 0.4785 - val_loss: 0.3995 - val_recall: 0.4397 - 2s/epoch - 5ms/step
Epoch 76/100
350/350 - 2s - loss: 0.3245 - recall: 0.4803 - val_loss: 0.3988 - val_recall: 0.4417 - 2s/epoch - 6ms/step
Epoch 77/100
350/350 - 1s - loss: 0.3313 - recall: 0.4890 - val_loss: 0.4014 - val_recall: 0.4560 - 1s/epoch - 3ms/step
Epoch 78/100
350/350 - 1s - loss: 0.3275 - recall: 0.4864 - val_loss: 0.4022 - val_recall: 0.4417 - 1s/epoch - 3ms/step
Epoch 79/100
350/350 - 1s - loss: 0.3257 - recall: 0.4943 - val_loss: 0.3985 - val_recall: 0.4438 - 910ms/epoch - 3ms/step
Epoch 80/100
350/350 - 1s - loss: 0.3202 - recall: 0.5022 - val_loss: 0.3996 - val_recall: 0.4356 - 889ms/epoch - 3ms/step
Epoch 81/100
350/350 - 1s - loss: 0.3200 - recall: 0.5118 - val_loss: 0.4002 - val_recall: 0.4560 - 895ms/epoch - 3ms/step
Epoch 82/100
350/350 - 1s - loss: 0.3166 - recall: 0.5092 - val_loss: 0.4019 - val_recall: 0.4397 - 1s/epoch - 3ms/step
Epoch 83/100
350/350 - 1s - loss: 0.3171 - recall: 0.5083 - val_loss: 0.4012 - val_recall: 0.4376 - 970ms/epoch - 3ms/step
Epoch 84/100
350/350 - 1s - loss: 0.3192 - recall: 0.5171 - val_loss: 0.3995 - val_recall: 0.4294 - 944ms/epoch - 3ms/step
Epoch 85/100
350/350 - 1s - loss: 0.3179 - recall: 0.5022 - val_loss: 0.4027 - val_recall: 0.4479 - 898ms/epoch - 3ms/step
Epoch 86/100
350/350 - 1s - loss: 0.3193 - recall: 0.5092 - val_loss: 0.4040 - val_recall: 0.4438 - 911ms/epoch - 3ms/step
Epoch 87/100
350/350 - 2s - loss: 0.3133 - recall: 0.5153 - val_loss: 0.4047 - val_recall: 0.4601 - 2s/epoch - 5ms/step
Epoch 88/100
350/350 - 2s - loss: 0.3178 - recall: 0.5215 - val_loss: 0.4059 - val_recall: 0.4540 - 2s/epoch - 5ms/step
Epoch 89/100
350/350 - 1s - loss: 0.3199 - recall: 0.5188 - val_loss: 0.4037 - val_recall: 0.4540 - 1s/epoch - 4ms/step
Epoch 90/100
350/350 - 1s - loss: 0.3141 - recall: 0.5153 - val_loss: 0.4052 - val_recall: 0.4397 - 1s/epoch - 3ms/step
Epoch 91/100
350/350 - 1s - loss: 0.3164 - recall: 0.5110 - val_loss: 0.4071 - val_recall: 0.4540 - 898ms/epoch - 3ms/step
Epoch 92/100
350/350 - 1s - loss: 0.3117 - recall: 0.5285 - val_loss: 0.4029 - val_recall: 0.4499 - 1000ms/epoch - 3ms/step
Epoch 93/100
350/350 - 1s - loss: 0.3076 - recall: 0.5329 - val_loss: 0.4057 - val_recall: 0.4683 - 1s/epoch - 3ms/step
Epoch 94/100
350/350 - 1s - loss: 0.3119 - recall: 0.5215 - val_loss: 0.4080 - val_recall: 0.4458 - 1s/epoch - 3ms/step
Epoch 95/100
350/350 - 1s - loss: 0.3060 - recall: 0.5434 - val_loss: 0.4046 - val_recall: 0.4438 - 903ms/epoch - 3ms/step
Epoch 96/100
350/350 - 1s - loss: 0.3083 - recall: 0.5329 - val_loss: 0.4104 - val_recall: 0.4540 - 984ms/epoch - 3ms/step
Epoch 97/100
350/350 - 1s - loss: 0.3152 - recall: 0.5110 - val_loss: 0.4053 - val_recall: 0.4397 - 1s/epoch - 3ms/step
Epoch 98/100
350/350 - 1s - loss: 0.3101 - recall: 0.5145 - val_loss: 0.4047 - val_recall: 0.4540 - 907ms/epoch - 3ms/step
Epoch 99/100
350/350 - 1s - loss: 0.3091 - recall: 0.5188 - val_loss: 0.4071 - val_recall: 0.4499 - 1s/epoch - 3ms/step
Epoch 100/100
350/350 - 1s - loss: 0.3073 - recall: 0.5548 - val_loss: 0.4102 - val_recall: 0.4683 - 1s/epoch - 4ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                                                       2
# neurons - hidden layer                                 [128, 128, 64]
activation function - hidden layer                         [relu, relu]
# epochs                                                            100
batch size                                                           16
optimizer                                                           SGD
learning rate, momentum, dropout                        [0.001, 0.0, 0]
weight initializer                                           he_uniform
regularization                                                        -
train loss                                                     0.307272
validation loss                                                0.410196
train recall                                                   0.554776
validation recall                                              0.468303
time (secs)                                                      115.66
model                 <keras.src.engine.sequential.Sequential object...
history               <keras.src.callbacks.History object at 0x7dfce...
Name: 1, dtype: object
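Validation recall stays below 0.5 in both runs so far, which is typical when the positive (churn) class is a minority. One common remedy, not used in these runs, is passing `class_weight` to `fit`. A minimal sketch of the sklearn-style "balanced" weighting; the 80/20 split below is illustrative, not the dataset's actual ratio:

```python
from collections import Counter

def balanced_class_weights(y):
    """sklearn-style 'balanced' weights: n_samples / (n_classes * count_c)."""
    counts = Counter(y)
    n, k = len(y), len(counts)
    return {cls: n / (k * c) for cls, c in counts.items()}

# Hypothetical 80/20 class split for illustration:
y = [0] * 80 + [1] * 20
print(balanced_class_weights(y))  # {0: 0.625, 1: 2.5}
```

The resulting dict can be passed directly as `model.fit(..., class_weight=weights)`, which scales the loss contribution of minority-class samples upward and usually trades some precision for recall.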
i+=1
model_fit('relu','relu','SGD-Mom',X_train,y_train,100,i,momentumval=0.5)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization) (None, 128) 512
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization) (None, 64) 256
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
350/350 - 3s - loss: 0.7364 - recall: 0.4733 - val_loss: 0.5772 - val_recall: 0.4110 - 3s/epoch - 7ms/step
Epoch 2/100
350/350 - 1s - loss: 0.5568 - recall: 0.4470 - val_loss: 0.5338 - val_recall: 0.4376 - 1s/epoch - 3ms/step
Epoch 3/100
350/350 - 1s - loss: 0.4948 - recall: 0.3891 - val_loss: 0.4867 - val_recall: 0.3845 - 965ms/epoch - 3ms/step
Epoch 4/100
350/350 - 1s - loss: 0.4601 - recall: 0.3532 - val_loss: 0.4647 - val_recall: 0.3661 - 952ms/epoch - 3ms/step
Epoch 5/100
350/350 - 1s - loss: 0.4449 - recall: 0.3243 - val_loss: 0.4547 - val_recall: 0.3558 - 949ms/epoch - 3ms/step
Epoch 6/100
350/350 - 1s - loss: 0.4348 - recall: 0.3225 - val_loss: 0.4453 - val_recall: 0.3579 - 978ms/epoch - 3ms/step
Epoch 7/100
350/350 - 1s - loss: 0.4226 - recall: 0.3243 - val_loss: 0.4381 - val_recall: 0.3456 - 1s/epoch - 3ms/step
Epoch 8/100
350/350 - 1s - loss: 0.4154 - recall: 0.3339 - val_loss: 0.4325 - val_recall: 0.3436 - 977ms/epoch - 3ms/step
Epoch 9/100
350/350 - 1s - loss: 0.4068 - recall: 0.3295 - val_loss: 0.4279 - val_recall: 0.3558 - 946ms/epoch - 3ms/step
Epoch 10/100
350/350 - 1s - loss: 0.4067 - recall: 0.3146 - val_loss: 0.4240 - val_recall: 0.3517 - 1s/epoch - 4ms/step
Epoch 11/100
350/350 - 2s - loss: 0.4030 - recall: 0.3287 - val_loss: 0.4224 - val_recall: 0.3599 - 2s/epoch - 5ms/step
Epoch 12/100
350/350 - 2s - loss: 0.3907 - recall: 0.3523 - val_loss: 0.4192 - val_recall: 0.3579 - 2s/epoch - 6ms/step
Epoch 13/100
350/350 - 1s - loss: 0.3899 - recall: 0.3383 - val_loss: 0.4156 - val_recall: 0.3661 - 1s/epoch - 3ms/step
Epoch 14/100
350/350 - 1s - loss: 0.3856 - recall: 0.3365 - val_loss: 0.4168 - val_recall: 0.3681 - 1s/epoch - 3ms/step
Epoch 15/100
350/350 - 1s - loss: 0.3825 - recall: 0.3488 - val_loss: 0.4132 - val_recall: 0.3722 - 1s/epoch - 3ms/step
Epoch 16/100
350/350 - 1s - loss: 0.3746 - recall: 0.3751 - val_loss: 0.4109 - val_recall: 0.3885 - 1s/epoch - 3ms/step
Epoch 17/100
350/350 - 1s - loss: 0.3758 - recall: 0.3751 - val_loss: 0.4093 - val_recall: 0.3967 - 948ms/epoch - 3ms/step
Epoch 18/100
350/350 - 1s - loss: 0.3686 - recall: 0.3865 - val_loss: 0.4094 - val_recall: 0.3967 - 943ms/epoch - 3ms/step
Epoch 19/100
350/350 - 1s - loss: 0.3713 - recall: 0.3690 - val_loss: 0.4080 - val_recall: 0.3967 - 939ms/epoch - 3ms/step
Epoch 20/100
350/350 - 1s - loss: 0.3654 - recall: 0.3970 - val_loss: 0.4046 - val_recall: 0.3845 - 972ms/epoch - 3ms/step
Epoch 21/100
350/350 - 1s - loss: 0.3666 - recall: 0.3935 - val_loss: 0.4076 - val_recall: 0.3988 - 967ms/epoch - 3ms/step
Epoch 22/100
350/350 - 1s - loss: 0.3593 - recall: 0.4242 - val_loss: 0.4067 - val_recall: 0.3947 - 971ms/epoch - 3ms/step
Epoch 23/100
350/350 - 2s - loss: 0.3622 - recall: 0.4023 - val_loss: 0.4034 - val_recall: 0.4070 - 2s/epoch - 5ms/step
Epoch 24/100
350/350 - 2s - loss: 0.3467 - recall: 0.4321 - val_loss: 0.4030 - val_recall: 0.4070 - 2s/epoch - 5ms/step
Epoch 25/100
350/350 - 2s - loss: 0.3479 - recall: 0.4452 - val_loss: 0.4039 - val_recall: 0.4213 - 2s/epoch - 5ms/step
Epoch 26/100
350/350 - 1s - loss: 0.3493 - recall: 0.4505 - val_loss: 0.4035 - val_recall: 0.4213 - 1s/epoch - 3ms/step
Epoch 27/100
350/350 - 1s - loss: 0.3523 - recall: 0.4233 - val_loss: 0.4037 - val_recall: 0.4335 - 997ms/epoch - 3ms/step
Epoch 28/100
350/350 - 1s - loss: 0.3487 - recall: 0.4391 - val_loss: 0.4019 - val_recall: 0.4294 - 1000ms/epoch - 3ms/step
Epoch 29/100
350/350 - 1s - loss: 0.3446 - recall: 0.4461 - val_loss: 0.4028 - val_recall: 0.4417 - 938ms/epoch - 3ms/step
Epoch 30/100
350/350 - 1s - loss: 0.3466 - recall: 0.4479 - val_loss: 0.4045 - val_recall: 0.4397 - 1s/epoch - 3ms/step
Epoch 31/100
350/350 - 1s - loss: 0.3462 - recall: 0.4479 - val_loss: 0.3990 - val_recall: 0.4254 - 884ms/epoch - 3ms/step
Epoch 32/100
350/350 - 1s - loss: 0.3382 - recall: 0.4715 - val_loss: 0.4030 - val_recall: 0.4397 - 1s/epoch - 3ms/step
Epoch 33/100
350/350 - 1s - loss: 0.3387 - recall: 0.4610 - val_loss: 0.4032 - val_recall: 0.4376 - 959ms/epoch - 3ms/step
Epoch 34/100
350/350 - 1s - loss: 0.3307 - recall: 0.4689 - val_loss: 0.4036 - val_recall: 0.4479 - 943ms/epoch - 3ms/step
Epoch 35/100
350/350 - 1s - loss: 0.3412 - recall: 0.4557 - val_loss: 0.4001 - val_recall: 0.4458 - 916ms/epoch - 3ms/step
Epoch 36/100
350/350 - 2s - loss: 0.3357 - recall: 0.4803 - val_loss: 0.4008 - val_recall: 0.4499 - 2s/epoch - 5ms/step
Epoch 37/100
350/350 - 2s - loss: 0.3331 - recall: 0.4803 - val_loss: 0.3995 - val_recall: 0.4479 - 2s/epoch - 6ms/step
Epoch 38/100
350/350 - 1s - loss: 0.3296 - recall: 0.4882 - val_loss: 0.3992 - val_recall: 0.4581 - 1s/epoch - 4ms/step
Epoch 39/100
350/350 - 1s - loss: 0.3317 - recall: 0.4777 - val_loss: 0.4004 - val_recall: 0.4622 - 995ms/epoch - 3ms/step
Epoch 40/100
350/350 - 1s - loss: 0.3316 - recall: 0.4706 - val_loss: 0.4041 - val_recall: 0.4499 - 942ms/epoch - 3ms/step
Epoch 41/100
350/350 - 1s - loss: 0.3253 - recall: 0.5031 - val_loss: 0.4037 - val_recall: 0.4622 - 1s/epoch - 3ms/step
Epoch 42/100
350/350 - 1s - loss: 0.3277 - recall: 0.4978 - val_loss: 0.4032 - val_recall: 0.4601 - 919ms/epoch - 3ms/step
Epoch 43/100
350/350 - 1s - loss: 0.3243 - recall: 0.4978 - val_loss: 0.4025 - val_recall: 0.4479 - 1s/epoch - 3ms/step
Epoch 44/100
350/350 - 1s - loss: 0.3159 - recall: 0.4961 - val_loss: 0.4024 - val_recall: 0.4458 - 962ms/epoch - 3ms/step
Epoch 45/100
350/350 - 1s - loss: 0.3111 - recall: 0.5486 - val_loss: 0.4065 - val_recall: 0.4581 - 952ms/epoch - 3ms/step
Epoch 46/100
350/350 - 1s - loss: 0.3163 - recall: 0.5206 - val_loss: 0.4089 - val_recall: 0.4703 - 1s/epoch - 3ms/step
Epoch 47/100
350/350 - 1s - loss: 0.3162 - recall: 0.5206 - val_loss: 0.4078 - val_recall: 0.4683 - 1s/epoch - 3ms/step
Epoch 48/100
350/350 - 1s - loss: 0.3207 - recall: 0.5101 - val_loss: 0.4044 - val_recall: 0.4601 - 1s/epoch - 4ms/step
Epoch 49/100
350/350 - 2s - loss: 0.3098 - recall: 0.5285 - val_loss: 0.4087 - val_recall: 0.4622 - 2s/epoch - 4ms/step
Epoch 50/100
350/350 - 2s - loss: 0.3087 - recall: 0.5215 - val_loss: 0.4031 - val_recall: 0.4581 - 2s/epoch - 6ms/step
Epoch 51/100
350/350 - 1s - loss: 0.3152 - recall: 0.4978 - val_loss: 0.4025 - val_recall: 0.4703 - 1s/epoch - 3ms/step
Epoch 52/100
350/350 - 1s - loss: 0.3089 - recall: 0.5355 - val_loss: 0.4076 - val_recall: 0.4744 - 1s/epoch - 3ms/step
Epoch 53/100
350/350 - 1s - loss: 0.3133 - recall: 0.5180 - val_loss: 0.4067 - val_recall: 0.4581 - 934ms/epoch - 3ms/step
Epoch 54/100
350/350 - 1s - loss: 0.3070 - recall: 0.5381 - val_loss: 0.4067 - val_recall: 0.4601 - 973ms/epoch - 3ms/step
Epoch 55/100
350/350 - 1s - loss: 0.3020 - recall: 0.5399 - val_loss: 0.4092 - val_recall: 0.4560 - 980ms/epoch - 3ms/step
Epoch 56/100
350/350 - 1s - loss: 0.3074 - recall: 0.5188 - val_loss: 0.4075 - val_recall: 0.4417 - 1s/epoch - 3ms/step
Epoch 57/100
350/350 - 1s - loss: 0.3052 - recall: 0.5495 - val_loss: 0.4123 - val_recall: 0.4683 - 1s/epoch - 3ms/step
Epoch 58/100
350/350 - 1s - loss: 0.2999 - recall: 0.5574 - val_loss: 0.4115 - val_recall: 0.4724 - 927ms/epoch - 3ms/step
Epoch 59/100
350/350 - 1s - loss: 0.3040 - recall: 0.5250 - val_loss: 0.4103 - val_recall: 0.4458 - 959ms/epoch - 3ms/step
Epoch 60/100
350/350 - 1s - loss: 0.2933 - recall: 0.5504 - val_loss: 0.4138 - val_recall: 0.4826 - 948ms/epoch - 3ms/step
Epoch 61/100
350/350 - 2s - loss: 0.2995 - recall: 0.5469 - val_loss: 0.4145 - val_recall: 0.4356 - 2s/epoch - 5ms/step
Epoch 62/100
350/350 - 2s - loss: 0.2966 - recall: 0.5609 - val_loss: 0.4144 - val_recall: 0.4458 - 2s/epoch - 6ms/step
Epoch 63/100
350/350 - 1s - loss: 0.3037 - recall: 0.5355 - val_loss: 0.4162 - val_recall: 0.4438 - 1s/epoch - 4ms/step
Epoch 64/100
350/350 - 1s - loss: 0.3008 - recall: 0.5574 - val_loss: 0.4185 - val_recall: 0.4479 - 943ms/epoch - 3ms/step
Epoch 65/100
350/350 - 1s - loss: 0.2947 - recall: 0.5557 - val_loss: 0.4193 - val_recall: 0.4560 - 991ms/epoch - 3ms/step
Epoch 66/100
350/350 - 1s - loss: 0.2944 - recall: 0.5618 - val_loss: 0.4185 - val_recall: 0.4438 - 1s/epoch - 3ms/step
Epoch 67/100
350/350 - 1s - loss: 0.2945 - recall: 0.5425 - val_loss: 0.4196 - val_recall: 0.4540 - 1s/epoch - 3ms/step
Epoch 68/100
350/350 - 1s - loss: 0.2859 - recall: 0.6004 - val_loss: 0.4272 - val_recall: 0.4560 - 1s/epoch - 3ms/step
Epoch 69/100
350/350 - 1s - loss: 0.2977 - recall: 0.5495 - val_loss: 0.4250 - val_recall: 0.4397 - 965ms/epoch - 3ms/step
Epoch 70/100
350/350 - 1s - loss: 0.2936 - recall: 0.5662 - val_loss: 0.4229 - val_recall: 0.4581 - 963ms/epoch - 3ms/step
Epoch 71/100
350/350 - 1s - loss: 0.2935 - recall: 0.5486 - val_loss: 0.4270 - val_recall: 0.4397 - 983ms/epoch - 3ms/step
Epoch 72/100
350/350 - 1s - loss: 0.2818 - recall: 0.5776 - val_loss: 0.4300 - val_recall: 0.4519 - 1s/epoch - 3ms/step
Epoch 73/100
350/350 - 1s - loss: 0.2824 - recall: 0.5933 - val_loss: 0.4294 - val_recall: 0.4663 - 1s/epoch - 4ms/step
Epoch 74/100
350/350 - 1s - loss: 0.2918 - recall: 0.5600 - val_loss: 0.4266 - val_recall: 0.4622 - 1s/epoch - 4ms/step
Epoch 75/100
350/350 - 2s - loss: 0.2837 - recall: 0.5732 - val_loss: 0.4274 - val_recall: 0.4581 - 2s/epoch - 5ms/step
Epoch 76/100
350/350 - 1s - loss: 0.2807 - recall: 0.5925 - val_loss: 0.4227 - val_recall: 0.4724 - 1s/epoch - 4ms/step
Epoch 77/100
350/350 - 1s - loss: 0.2848 - recall: 0.5925 - val_loss: 0.4315 - val_recall: 0.4663 - 939ms/epoch - 3ms/step
Epoch 78/100
350/350 - 1s - loss: 0.2815 - recall: 0.5644 - val_loss: 0.4350 - val_recall: 0.4560 - 903ms/epoch - 3ms/step
Epoch 79/100
350/350 - 1s - loss: 0.2804 - recall: 0.6021 - val_loss: 0.4341 - val_recall: 0.4540 - 940ms/epoch - 3ms/step
Epoch 80/100
350/350 - 1s - loss: 0.2812 - recall: 0.5907 - val_loss: 0.4324 - val_recall: 0.4540 - 1s/epoch - 3ms/step
Epoch 81/100
350/350 - 1s - loss: 0.2773 - recall: 0.5968 - val_loss: 0.4318 - val_recall: 0.4642 - 1s/epoch - 3ms/step
Epoch 82/100
350/350 - 1s - loss: 0.2751 - recall: 0.5837 - val_loss: 0.4305 - val_recall: 0.4519 - 1s/epoch - 3ms/step
Epoch 83/100
350/350 - 1s - loss: 0.2722 - recall: 0.6056 - val_loss: 0.4414 - val_recall: 0.4601 - 902ms/epoch - 3ms/step
Epoch 84/100
350/350 - 1s - loss: 0.2734 - recall: 0.6109 - val_loss: 0.4311 - val_recall: 0.4581 - 1s/epoch - 3ms/step
Epoch 85/100
350/350 - 1s - loss: 0.2692 - recall: 0.6021 - val_loss: 0.4416 - val_recall: 0.4540 - 1s/epoch - 3ms/step
Epoch 86/100
350/350 - 1s - loss: 0.2726 - recall: 0.5977 - val_loss: 0.4459 - val_recall: 0.4581 - 1s/epoch - 4ms/step
Epoch 87/100
350/350 - 1s - loss: 0.2673 - recall: 0.5977 - val_loss: 0.4366 - val_recall: 0.4581 - 1s/epoch - 4ms/step
Epoch 88/100
350/350 - 2s - loss: 0.2753 - recall: 0.6030 - val_loss: 0.4445 - val_recall: 0.4601 - 2s/epoch - 5ms/step
Epoch 89/100
350/350 - 1s - loss: 0.2746 - recall: 0.6056 - val_loss: 0.4500 - val_recall: 0.4642 - 1s/epoch - 4ms/step
Epoch 90/100
350/350 - 1s - loss: 0.2694 - recall: 0.5925 - val_loss: 0.4433 - val_recall: 0.4499 - 1s/epoch - 3ms/step
Epoch 91/100
350/350 - 1s - loss: 0.2668 - recall: 0.6065 - val_loss: 0.4449 - val_recall: 0.4499 - 998ms/epoch - 3ms/step
Epoch 92/100
350/350 - 1s - loss: 0.2662 - recall: 0.6135 - val_loss: 0.4443 - val_recall: 0.4315 - 898ms/epoch - 3ms/step
Epoch 93/100
350/350 - 1s - loss: 0.2587 - recall: 0.6249 - val_loss: 0.4554 - val_recall: 0.4642 - 893ms/epoch - 3ms/step
Epoch 94/100
350/350 - 1s - loss: 0.2652 - recall: 0.6152 - val_loss: 0.4595 - val_recall: 0.4458 - 904ms/epoch - 3ms/step
Epoch 95/100
350/350 - 1s - loss: 0.2613 - recall: 0.6363 - val_loss: 0.4585 - val_recall: 0.4458 - 978ms/epoch - 3ms/step
Epoch 96/100
350/350 - 1s - loss: 0.2610 - recall: 0.6205 - val_loss: 0.4614 - val_recall: 0.4540 - 896ms/epoch - 3ms/step
Epoch 97/100
350/350 - 1s - loss: 0.2677 - recall: 0.6021 - val_loss: 0.4525 - val_recall: 0.4376 - 1s/epoch - 3ms/step
Epoch 98/100
350/350 - 1s - loss: 0.2602 - recall: 0.6170 - val_loss: 0.4504 - val_recall: 0.4663 - 937ms/epoch - 3ms/step
Epoch 99/100
350/350 - 1s - loss: 0.2583 - recall: 0.6275 - val_loss: 0.4591 - val_recall: 0.4560 - 1s/epoch - 3ms/step
Epoch 100/100
350/350 - 2s - loss: 0.2605 - recall: 0.6266 - val_loss: 0.4622 - val_recall: 0.4560 - 2s/epoch - 4ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                                                       2
# neurons - hidden layer                                 [128, 128, 64]
activation function - hidden layer                         [relu, relu]
# epochs                                                            100
batch size                                                           16
optimizer                                                       SGD-Mom
learning rate, momentum, dropout                        [0.001, 0.5, 0]
weight initializer                                           he_uniform
regularization                                                        -
train loss                                                     0.260472
validation loss                                                0.462195
train recall                                                   0.626643
validation recall                                              0.456033
time (secs)                                                      143.31
model                 <keras.src.engine.sequential.Sequential object...
history               <keras.src.callbacks.History object at 0x7dfce...
Name: 2, dtype: object
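The gap between train loss (0.2605) and validation loss (0.4622) shows the SGD-with-momentum run overfitting: validation loss bottoms out around epoch 31 (0.3990) and climbs afterwards. Keras' `EarlyStopping(monitor='val_loss', patience=..., restore_best_weights=True)` callback would cut training there; the patience rule it applies can be sketched in plain Python (the helper name is illustrative):

```python
def early_stop_epoch(val_losses, patience=10):
    """Return the 1-based epoch at which patience-based early stopping
    would halt: the first epoch whose val loss has not improved on the
    best seen for `patience` consecutive epochs."""
    best, best_epoch = float('inf'), 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch
    return len(val_losses)

# A toy val-loss curve that improves, then degrades:
print(early_stop_epoch([0.45, 0.42, 0.40, 0.41, 0.42, 0.43, 0.44],
                       patience=3))  # 6
```

With `restore_best_weights=True`, Keras would additionally roll the model back to the weights from the best epoch rather than the stopping epoch.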
i+=1
model_fit('relu','relu','RMS',X_train,y_train,100,i,learning_rte=1e-4,momentumval=0.75)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization) (None, 128) 512
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization) (None, 64) 256
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
350/350 - 3s - loss: 0.5667 - recall: 0.4496 - val_loss: 0.4296 - val_recall: 0.3517 - 3s/epoch - 8ms/step
Epoch 2/100
350/350 - 1s - loss: 0.4134 - recall: 0.3655 - val_loss: 0.4045 - val_recall: 0.4192 - 1s/epoch - 3ms/step
... (epochs 3-98 omitted: training loss fell steadily from 0.3790 to 0.1597 while validation loss rose from 0.3972 to 0.7218) ...
Epoch 99/100
350/350 - 1s - loss: 0.1551 - recall: 0.8116 - val_loss: 0.7395 - val_recall: 0.4806 - 1s/epoch - 3ms/step
Epoch 100/100
350/350 - 1s - loss: 0.1532 - recall: 0.8168 - val_loss: 0.7255 - val_recall: 0.4826 - 1s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                                     2
# neurons - hidden layer                            [128, 128, 64]
activation function - hidden layer                  [relu, relu]
# epochs                                            100
batch size                                          16
optimizer                                           RMS
learning rate, momentum, dropout                    [0.0001, 0.75, 0]
weight initializer                                  he_uniform
regularization                                      -
train loss                                          0.153165
validation loss                                     0.725464
train recall                                        0.816827
validation recall                                   0.482618
time (secs)                                         134.9
model                  <keras.src.engine.sequential.Sequential object...
history                <keras.src.callbacks.History object at 0x7dfce...
Name: 3, dtype: object
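The `plot` helper called above is defined earlier in the notebook. For reference, a minimal sketch of what such a helper can look like, assuming it takes a Keras `History` object and a metric name (the notebook's exact definition may differ):

```python
import matplotlib.pyplot as plt

def plot(history, metric):
    """Plot training vs. validation curves for one metric from a Keras History.

    Assumes `history.history` holds per-epoch lists keyed by `metric`
    and 'val_' + metric, as Keras records them during model.fit().
    """
    plt.plot(history.history[metric], label=f'train {metric}')
    plt.plot(history.history['val_' + metric], label=f'validation {metric}')
    plt.xlabel('epoch')
    plt.ylabel(metric)
    plt.legend()
    plt.show()
```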
i+=1
model_fit('relu','relu','Adam',X_train,y_train,100,i,learning_rte=1e-4)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization) (None, 128) 512
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization) (None, 64) 256
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
350/350 - 3s - loss: 0.7358 - recall: 0.5267 - val_loss: 0.5805 - val_recall: 0.5194 - 3s/epoch - 8ms/step
Epoch 2/100
350/350 - 1s - loss: 0.5687 - recall: 0.5408 - val_loss: 0.5452 - val_recall: 0.5276 - 1s/epoch - 4ms/step
... (epochs 3-98 omitted: training loss fell steadily from 0.4935 to 0.1621 while validation loss bottomed out near 0.3933 around epoch 15 and then climbed to 0.6059) ...
Epoch 99/100
350/350 - 2s - loss: 0.1604 - recall: 0.7975 - val_loss: 0.6245 - val_recall: 0.4601 - 2s/epoch - 5ms/step
Epoch 100/100
350/350 - 2s - loss: 0.1615 - recall: 0.7923 - val_loss: 0.6461 - val_recall: 0.4622 - 2s/epoch - 6ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                                     2
# neurons - hidden layer                            [128, 128, 64]
activation function - hidden layer                  [relu, relu]
# epochs                                            100
batch size                                          16
optimizer                                           Adam
learning rate, momentum, dropout                    [0.0001, 0.0, 0]
weight initializer                                  he_uniform
regularization                                      -
train loss                                          0.161517
validation loss                                     0.64609
train recall                                        0.792287
validation recall                                   0.462168
time (secs)                                         143.6
model                  <keras.src.engine.sequential.Sequential object...
history                <keras.src.callbacks.History object at 0x7dfce...
Name: 4, dtype: object
Changed the optimizer to Adam with a learning rate of 0.0001, expecting the model to converge smoothly on both the training and validation data. Although the training loss decreased steadily, the model showed very little improvement on the validation data (validation loss actually rose), so this configuration is not suitable for our purpose.
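The pattern above (training loss falling while validation loss rises) also suggests early stopping as a complementary remedy. The sketch below is illustrative only, not the notebook's code: it uses random stand-in data and Keras's `EarlyStopping` callback to halt training once validation loss stops improving, rather than running all 100 epochs past the point of overfitting.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.callbacks import EarlyStopping

# Random stand-in data (500 rows, 11 features) in place of the bank dataset.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 11)).astype('float32')
y = (rng.uniform(size=(500, 1)) < 0.2).astype('float32')

model = Sequential([
    Dense(32, activation='relu', input_shape=(11,)),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
              loss='binary_crossentropy',
              metrics=[tf.keras.metrics.Recall(name='recall')])

# Stop once val_loss has not improved for `patience` epochs and roll back
# to the best weights seen so far.
stopper = EarlyStopping(monitor='val_loss', patience=5,
                        restore_best_weights=True)
history = model.fit(X, y, validation_split=0.2, epochs=100,
                    batch_size=32, callbacks=[stopper], verbose=0)
print(f"stopped after {len(history.history['loss'])} epochs")
```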
Next, we add Dropout layers to the model and try two dropout ratios (0.2 and 0.3) to see whether this regularization improves validation performance.
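A minimal sketch of how such a dropout network can be assembled in Keras, mirroring the dense/batch-norm/dropout ordering of the summary printed below. The notebook's own `model_fit_with_dropout` helper is defined earlier and may differ in detail; the 11-feature input and `he_uniform` initializer here are inferred from the reported parameter counts and hyperparameters.

```python
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, BatchNormalization

def build_dropout_model(dropoutval=0.2, n_features=11):
    """Dense network with Dropout after each hidden block.

    Layer order follows the model summary: Dense(128) -> Dropout ->
    Dense(128) -> BatchNorm -> Dropout -> Dense(64) -> BatchNorm ->
    Dropout -> Dense(1, sigmoid).
    """
    model = Sequential([
        Dense(128, activation='relu', kernel_initializer='he_uniform',
              input_shape=(n_features,)),
        Dropout(dropoutval),
        Dense(128, activation='relu', kernel_initializer='he_uniform'),
        BatchNormalization(),
        Dropout(dropoutval),
        Dense(64, activation='relu', kernel_initializer='he_uniform'),
        BatchNormalization(),
        Dropout(dropoutval),
        Dense(1, activation='sigmoid'),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
                  loss='binary_crossentropy',
                  metrics=[tf.keras.metrics.Recall(name='recall')])
    return model
```

Dropout layers add no trainable parameters, so this model has the same 27,137 total parameters as the previous one.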
i+=1
model_fit_with_dropout('relu','relu','Adam',X_train,y_train,100,i,dropoutval=0.2,learning_rte=1e-4) # dropout value is set at 0.2
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dropout (Dropout) (None, 128) 0
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization) (None, 128) 512
dropout_1 (Dropout) (None, 128) 0
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization) (None, 64) 256
dropout_2 (Dropout) (None, 64) 0
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
350/350 - 3s - loss: 0.8486 - recall: 0.4987 - val_loss: 0.5596 - val_recall: 0.4172 - 3s/epoch - 10ms/step
Epoch 2/100
350/350 - 2s - loss: 0.6892 - recall: 0.4487 - val_loss: 0.5099 - val_recall: 0.3742 - 2s/epoch - 5ms/step
... (epochs 3-70 omitted: training loss fell from 0.6117 to 0.3796 while validation loss also kept falling, from 0.4679 to 0.3486) ...
Epoch 71/100
350/350 - 1s - loss: 0.3744 - recall: 0.4058 - val_loss: 0.3495 - val_recall: 0.4581 - 1s/epoch - 4ms/step
Epoch 72/100
350/350 - 1s - loss: 0.3750 - recall: 0.4102 - val_loss: 0.3491 - val_recall: 0.4765 - 1s/epoch - 3ms/step
Epoch 73/100
350/350 - 1s - loss: 0.3728 - recall: 0.4189 - val_loss: 0.3488 - val_recall: 0.4847 - 1s/epoch - 3ms/step
Epoch 74/100
350/350 - 1s - loss: 0.3782 - recall: 0.4040 - val_loss: 0.3484 - val_recall: 0.4663 - 1s/epoch - 3ms/step
Epoch 75/100
350/350 - 2s - loss: 0.3724 - recall: 0.4154 - val_loss: 0.3489 - val_recall: 0.4622 - 2s/epoch - 5ms/step
Epoch 76/100
350/350 - 2s - loss: 0.3681 - recall: 0.4102 - val_loss: 0.3474 - val_recall: 0.4785 - 2s/epoch - 7ms/step
Epoch 77/100
350/350 - 1s - loss: 0.3727 - recall: 0.4198 - val_loss: 0.3466 - val_recall: 0.4806 - 1s/epoch - 4ms/step
Epoch 78/100
350/350 - 1s - loss: 0.3703 - recall: 0.4259 - val_loss: 0.3477 - val_recall: 0.4683 - 1s/epoch - 3ms/step
Epoch 79/100
350/350 - 1s - loss: 0.3738 - recall: 0.3996 - val_loss: 0.3482 - val_recall: 0.4806 - 1s/epoch - 3ms/step
Epoch 80/100
350/350 - 1s - loss: 0.3721 - recall: 0.4268 - val_loss: 0.3478 - val_recall: 0.4765 - 1s/epoch - 3ms/step
Epoch 81/100
350/350 - 1s - loss: 0.3747 - recall: 0.4321 - val_loss: 0.3477 - val_recall: 0.4888 - 1s/epoch - 3ms/step
Epoch 82/100
350/350 - 1s - loss: 0.3687 - recall: 0.4338 - val_loss: 0.3473 - val_recall: 0.4703 - 1s/epoch - 3ms/step
Epoch 83/100
350/350 - 1s - loss: 0.3710 - recall: 0.4093 - val_loss: 0.3467 - val_recall: 0.4847 - 1s/epoch - 4ms/step
Epoch 84/100
350/350 - 1s - loss: 0.3632 - recall: 0.4303 - val_loss: 0.3467 - val_recall: 0.4765 - 1s/epoch - 4ms/step
Epoch 85/100
350/350 - 1s - loss: 0.3678 - recall: 0.4207 - val_loss: 0.3470 - val_recall: 0.4928 - 1s/epoch - 3ms/step
Epoch 86/100
350/350 - 2s - loss: 0.3747 - recall: 0.4233 - val_loss: 0.3463 - val_recall: 0.4765 - 2s/epoch - 5ms/step
Epoch 87/100
350/350 - 2s - loss: 0.3629 - recall: 0.4303 - val_loss: 0.3458 - val_recall: 0.5031 - 2s/epoch - 7ms/step
Epoch 88/100
350/350 - 1s - loss: 0.3699 - recall: 0.4163 - val_loss: 0.3465 - val_recall: 0.4908 - 1s/epoch - 3ms/step
Epoch 89/100
350/350 - 1s - loss: 0.3701 - recall: 0.4189 - val_loss: 0.3461 - val_recall: 0.4888 - 1s/epoch - 3ms/step
Epoch 90/100
350/350 - 1s - loss: 0.3762 - recall: 0.4356 - val_loss: 0.3468 - val_recall: 0.4642 - 1s/epoch - 3ms/step
Epoch 91/100
350/350 - 1s - loss: 0.3678 - recall: 0.4216 - val_loss: 0.3469 - val_recall: 0.4888 - 1s/epoch - 4ms/step
Epoch 92/100
350/350 - 1s - loss: 0.3695 - recall: 0.4172 - val_loss: 0.3470 - val_recall: 0.4744 - 1s/epoch - 3ms/step
Epoch 93/100
350/350 - 1s - loss: 0.3662 - recall: 0.4356 - val_loss: 0.3477 - val_recall: 0.4847 - 1s/epoch - 3ms/step
Epoch 94/100
350/350 - 1s - loss: 0.3636 - recall: 0.4426 - val_loss: 0.3465 - val_recall: 0.4765 - 1s/epoch - 3ms/step
Epoch 95/100
350/350 - 1s - loss: 0.3653 - recall: 0.4172 - val_loss: 0.3469 - val_recall: 0.4806 - 1s/epoch - 3ms/step
Epoch 96/100
350/350 - 2s - loss: 0.3665 - recall: 0.4242 - val_loss: 0.3478 - val_recall: 0.4785 - 2s/epoch - 4ms/step
Epoch 97/100
350/350 - 2s - loss: 0.3641 - recall: 0.4224 - val_loss: 0.3472 - val_recall: 0.4724 - 2s/epoch - 6ms/step
Epoch 98/100
350/350 - 2s - loss: 0.3632 - recall: 0.4391 - val_loss: 0.3464 - val_recall: 0.4785 - 2s/epoch - 6ms/step
Epoch 99/100
350/350 - 1s - loss: 0.3707 - recall: 0.4216 - val_loss: 0.3471 - val_recall: 0.4785 - 1s/epoch - 3ms/step
Epoch 100/100
350/350 - 1s - loss: 0.3626 - recall: 0.4417 - val_loss: 0.3471 - val_recall: 0.4806 - 1s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                                                      2
# neurons - hidden layer                                [128, 128, 64]
activation function - hidden layer                        [relu, relu]
# epochs                                                           100
batch size                                                          16
optimizer                                                         Adam
learning rate, momentum, dropout                    [0.0001, 0.0, 0.2]
weight initializer                                          he_uniform
regularization                                                       -
train loss                                                    0.362576
validation loss                                               0.347113
train recall                                                  0.441718
validation recall                                             0.480573
time (secs)                                                     142.95
model                <keras.src.engine.sequential.Sequential object...
history              <keras.src.callbacks.History object at 0x7dfce...
Name: 5, dtype: object
i+=1
model_fit_with_dropout('relu','relu','Adam',X_train,y_train,100,i,dropoutval=0.3,learning_rte=1e-4) # dropout value increased to 0.3
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dropout (Dropout) (None, 128) 0
dense_1 (Dense) (None, 128) 16512
batch_normalization (Batch (None, 128) 512
Normalization)
dropout_1 (Dropout) (None, 128) 0
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (Bat (None, 64) 256
chNormalization)
dropout_2 (Dropout) (None, 64) 0
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
350/350 - 3s - loss: 0.8844 - recall: 0.4847 - val_loss: 0.5573 - val_recall: 0.3333 - 3s/epoch - 9ms/step
Epoch 2/100
350/350 - 1s - loss: 0.7457 - recall: 0.4400 - val_loss: 0.5008 - val_recall: 0.2904 - 1s/epoch - 3ms/step
Epoch 3/100
350/350 - 1s - loss: 0.6516 - recall: 0.4172 - val_loss: 0.4674 - val_recall: 0.2515 - 1s/epoch - 4ms/step
Epoch 4/100
350/350 - 2s - loss: 0.6028 - recall: 0.3865 - val_loss: 0.4518 - val_recall: 0.2311 - 2s/epoch - 6ms/step
Epoch 5/100
350/350 - 2s - loss: 0.5693 - recall: 0.3304 - val_loss: 0.4424 - val_recall: 0.2106 - 2s/epoch - 7ms/step
Epoch 6/100
350/350 - 1s - loss: 0.5493 - recall: 0.3094 - val_loss: 0.4351 - val_recall: 0.2025 - 1s/epoch - 4ms/step
Epoch 7/100
350/350 - 1s - loss: 0.5325 - recall: 0.2805 - val_loss: 0.4289 - val_recall: 0.1718 - 1s/epoch - 3ms/step
Epoch 8/100
350/350 - 1s - loss: 0.5189 - recall: 0.2542 - val_loss: 0.4262 - val_recall: 0.1677 - 1s/epoch - 3ms/step
Epoch 9/100
350/350 - 1s - loss: 0.5065 - recall: 0.2489 - val_loss: 0.4228 - val_recall: 0.1472 - 1s/epoch - 3ms/step
Epoch 10/100
350/350 - 1s - loss: 0.5067 - recall: 0.2279 - val_loss: 0.4209 - val_recall: 0.1452 - 1s/epoch - 3ms/step
Epoch 11/100
350/350 - 1s - loss: 0.5056 - recall: 0.2209 - val_loss: 0.4200 - val_recall: 0.1493 - 1s/epoch - 3ms/step
Epoch 12/100
350/350 - 1s - loss: 0.5011 - recall: 0.2174 - val_loss: 0.4187 - val_recall: 0.1391 - 1s/epoch - 3ms/step
Epoch 13/100
350/350 - 1s - loss: 0.4836 - recall: 0.2270 - val_loss: 0.4170 - val_recall: 0.1452 - 1s/epoch - 3ms/step
Epoch 14/100
350/350 - 1s - loss: 0.4873 - recall: 0.2209 - val_loss: 0.4147 - val_recall: 0.1472 - 1s/epoch - 4ms/step
Epoch 15/100
350/350 - 2s - loss: 0.4747 - recall: 0.2191 - val_loss: 0.4129 - val_recall: 0.1513 - 2s/epoch - 6ms/step
Epoch 16/100
350/350 - 2s - loss: 0.4758 - recall: 0.2165 - val_loss: 0.4114 - val_recall: 0.1697 - 2s/epoch - 6ms/step
Epoch 17/100
350/350 - 1s - loss: 0.4755 - recall: 0.2428 - val_loss: 0.4085 - val_recall: 0.1922 - 1s/epoch - 4ms/step
Epoch 18/100
350/350 - 1s - loss: 0.4673 - recall: 0.2349 - val_loss: 0.4085 - val_recall: 0.1902 - 1s/epoch - 3ms/step
Epoch 19/100
350/350 - 1s - loss: 0.4660 - recall: 0.2191 - val_loss: 0.4083 - val_recall: 0.2045 - 1s/epoch - 3ms/step
Epoch 20/100
350/350 - 1s - loss: 0.4644 - recall: 0.2401 - val_loss: 0.4059 - val_recall: 0.1779 - 1s/epoch - 4ms/step
Epoch 21/100
350/350 - 1s - loss: 0.4604 - recall: 0.2323 - val_loss: 0.4021 - val_recall: 0.1984 - 1s/epoch - 3ms/step
Epoch 22/100
350/350 - 1s - loss: 0.4562 - recall: 0.2287 - val_loss: 0.4015 - val_recall: 0.2209 - 1s/epoch - 3ms/step
Epoch 23/100
350/350 - 1s - loss: 0.4669 - recall: 0.2323 - val_loss: 0.4008 - val_recall: 0.1984 - 1s/epoch - 4ms/step
Epoch 24/100
350/350 - 1s - loss: 0.4542 - recall: 0.2410 - val_loss: 0.4006 - val_recall: 0.1943 - 1s/epoch - 4ms/step
Epoch 25/100
350/350 - 2s - loss: 0.4529 - recall: 0.2472 - val_loss: 0.3967 - val_recall: 0.2065 - 2s/epoch - 6ms/step
Epoch 26/100
350/350 - 2s - loss: 0.4531 - recall: 0.2524 - val_loss: 0.3958 - val_recall: 0.2168 - 2s/epoch - 6ms/step
Epoch 27/100
350/350 - 1s - loss: 0.4456 - recall: 0.2436 - val_loss: 0.3946 - val_recall: 0.2025 - 1s/epoch - 4ms/step
Epoch 28/100
350/350 - 1s - loss: 0.4546 - recall: 0.2191 - val_loss: 0.3917 - val_recall: 0.2229 - 1s/epoch - 3ms/step
Epoch 29/100
350/350 - 1s - loss: 0.4428 - recall: 0.2375 - val_loss: 0.3905 - val_recall: 0.2249 - 1s/epoch - 3ms/step
Epoch 30/100
350/350 - 1s - loss: 0.4427 - recall: 0.2498 - val_loss: 0.3896 - val_recall: 0.2372 - 1s/epoch - 3ms/step
Epoch 31/100
350/350 - 1s - loss: 0.4469 - recall: 0.2515 - val_loss: 0.3874 - val_recall: 0.2474 - 1s/epoch - 3ms/step
Epoch 32/100
350/350 - 1s - loss: 0.4428 - recall: 0.2594 - val_loss: 0.3872 - val_recall: 0.2393 - 1s/epoch - 3ms/step
Epoch 33/100
350/350 - 1s - loss: 0.4474 - recall: 0.2401 - val_loss: 0.3876 - val_recall: 0.2270 - 1s/epoch - 4ms/step
Epoch 34/100
350/350 - 1s - loss: 0.4367 - recall: 0.2472 - val_loss: 0.3852 - val_recall: 0.2434 - 1s/epoch - 3ms/step
Epoch 35/100
350/350 - 2s - loss: 0.4345 - recall: 0.2515 - val_loss: 0.3836 - val_recall: 0.2638 - 2s/epoch - 6ms/step
Epoch 36/100
350/350 - 2s - loss: 0.4399 - recall: 0.2445 - val_loss: 0.3812 - val_recall: 0.2822 - 2s/epoch - 7ms/step
Epoch 37/100
350/350 - 1s - loss: 0.4393 - recall: 0.2550 - val_loss: 0.3811 - val_recall: 0.2822 - 1s/epoch - 4ms/step
Epoch 38/100
350/350 - 1s - loss: 0.4282 - recall: 0.2568 - val_loss: 0.3797 - val_recall: 0.2781 - 1s/epoch - 4ms/step
Epoch 39/100
350/350 - 1s - loss: 0.4359 - recall: 0.2445 - val_loss: 0.3797 - val_recall: 0.2883 - 1s/epoch - 4ms/step
Epoch 40/100
350/350 - 1s - loss: 0.4301 - recall: 0.2708 - val_loss: 0.3775 - val_recall: 0.2965 - 1s/epoch - 4ms/step
Epoch 41/100
350/350 - 1s - loss: 0.4272 - recall: 0.2770 - val_loss: 0.3744 - val_recall: 0.3313 - 1s/epoch - 3ms/step
Epoch 42/100
350/350 - 1s - loss: 0.4369 - recall: 0.2436 - val_loss: 0.3753 - val_recall: 0.3129 - 1s/epoch - 3ms/step
Epoch 43/100
350/350 - 1s - loss: 0.4301 - recall: 0.2568 - val_loss: 0.3723 - val_recall: 0.3374 - 1s/epoch - 4ms/step
Epoch 44/100
350/350 - 1s - loss: 0.4162 - recall: 0.2971 - val_loss: 0.3714 - val_recall: 0.3313 - 1s/epoch - 3ms/step
Epoch 45/100
350/350 - 2s - loss: 0.4168 - recall: 0.2989 - val_loss: 0.3714 - val_recall: 0.3374 - 2s/epoch - 5ms/step
Epoch 46/100
350/350 - 2s - loss: 0.4235 - recall: 0.2927 - val_loss: 0.3701 - val_recall: 0.3476 - 2s/epoch - 7ms/step
Epoch 47/100
350/350 - 2s - loss: 0.4186 - recall: 0.2796 - val_loss: 0.3688 - val_recall: 0.3517 - 2s/epoch - 4ms/step
Epoch 48/100
350/350 - 1s - loss: 0.4256 - recall: 0.2848 - val_loss: 0.3684 - val_recall: 0.3558 - 1s/epoch - 3ms/step
Epoch 49/100
350/350 - 1s - loss: 0.4163 - recall: 0.3015 - val_loss: 0.3669 - val_recall: 0.3599 - 1s/epoch - 4ms/step
Epoch 50/100
350/350 - 1s - loss: 0.4194 - recall: 0.3059 - val_loss: 0.3660 - val_recall: 0.3620 - 1s/epoch - 4ms/step
Epoch 51/100
350/350 - 1s - loss: 0.4227 - recall: 0.2726 - val_loss: 0.3655 - val_recall: 0.3661 - 1s/epoch - 4ms/step
Epoch 52/100
350/350 - 1s - loss: 0.4169 - recall: 0.3050 - val_loss: 0.3643 - val_recall: 0.3579 - 1s/epoch - 4ms/step
Epoch 53/100
350/350 - 1s - loss: 0.4183 - recall: 0.3024 - val_loss: 0.3648 - val_recall: 0.3517 - 1s/epoch - 4ms/step
Epoch 54/100
350/350 - 1s - loss: 0.4149 - recall: 0.3067 - val_loss: 0.3641 - val_recall: 0.3517 - 1s/epoch - 3ms/step
Epoch 55/100
350/350 - 2s - loss: 0.4116 - recall: 0.3050 - val_loss: 0.3635 - val_recall: 0.3640 - 2s/epoch - 6ms/step
Epoch 56/100
350/350 - 2s - loss: 0.4172 - recall: 0.3024 - val_loss: 0.3627 - val_recall: 0.3640 - 2s/epoch - 7ms/step
Epoch 57/100
350/350 - 1s - loss: 0.4085 - recall: 0.3190 - val_loss: 0.3629 - val_recall: 0.3620 - 1s/epoch - 4ms/step
Epoch 58/100
350/350 - 1s - loss: 0.4068 - recall: 0.3155 - val_loss: 0.3604 - val_recall: 0.3845 - 1s/epoch - 3ms/step
Epoch 59/100
350/350 - 1s - loss: 0.4138 - recall: 0.3111 - val_loss: 0.3605 - val_recall: 0.3681 - 1s/epoch - 4ms/step
Epoch 60/100
350/350 - 1s - loss: 0.4034 - recall: 0.3322 - val_loss: 0.3601 - val_recall: 0.3906 - 1s/epoch - 4ms/step
Epoch 61/100
350/350 - 1s - loss: 0.4092 - recall: 0.3155 - val_loss: 0.3591 - val_recall: 0.3804 - 1s/epoch - 4ms/step
Epoch 62/100
350/350 - 1s - loss: 0.4077 - recall: 0.3427 - val_loss: 0.3586 - val_recall: 0.3865 - 1s/epoch - 4ms/step
Epoch 63/100
350/350 - 1s - loss: 0.4069 - recall: 0.3295 - val_loss: 0.3575 - val_recall: 0.3885 - 1s/epoch - 4ms/step
Epoch 64/100
350/350 - 1s - loss: 0.4015 - recall: 0.3216 - val_loss: 0.3570 - val_recall: 0.3906 - 1s/epoch - 3ms/step
Epoch 65/100
350/350 - 2s - loss: 0.4051 - recall: 0.3216 - val_loss: 0.3565 - val_recall: 0.4008 - 2s/epoch - 5ms/step
Epoch 66/100
350/350 - 2s - loss: 0.4081 - recall: 0.3164 - val_loss: 0.3556 - val_recall: 0.4049 - 2s/epoch - 6ms/step
Epoch 67/100
350/350 - 2s - loss: 0.4040 - recall: 0.3295 - val_loss: 0.3562 - val_recall: 0.3967 - 2s/epoch - 5ms/step
Epoch 68/100
350/350 - 1s - loss: 0.4050 - recall: 0.3252 - val_loss: 0.3554 - val_recall: 0.4090 - 1s/epoch - 3ms/step
Epoch 69/100
350/350 - 1s - loss: 0.4042 - recall: 0.3190 - val_loss: 0.3549 - val_recall: 0.4151 - 1s/epoch - 3ms/step
Epoch 70/100
350/350 - 1s - loss: 0.4033 - recall: 0.3444 - val_loss: 0.3551 - val_recall: 0.4192 - 1s/epoch - 3ms/step
Epoch 71/100
350/350 - 1s - loss: 0.4004 - recall: 0.3471 - val_loss: 0.3558 - val_recall: 0.4110 - 1s/epoch - 3ms/step
Epoch 72/100
350/350 - 1s - loss: 0.3974 - recall: 0.3550 - val_loss: 0.3548 - val_recall: 0.4254 - 1s/epoch - 3ms/step
Epoch 73/100
350/350 - 1s - loss: 0.3950 - recall: 0.3479 - val_loss: 0.3546 - val_recall: 0.4315 - 1s/epoch - 3ms/step
Epoch 74/100
350/350 - 1s - loss: 0.4045 - recall: 0.3427 - val_loss: 0.3545 - val_recall: 0.4192 - 1s/epoch - 3ms/step
Epoch 75/100
350/350 - 1s - loss: 0.3984 - recall: 0.3444 - val_loss: 0.3547 - val_recall: 0.4192 - 1s/epoch - 3ms/step
Epoch 76/100
350/350 - 2s - loss: 0.3902 - recall: 0.3506 - val_loss: 0.3531 - val_recall: 0.4315 - 2s/epoch - 6ms/step
Epoch 77/100
350/350 - 2s - loss: 0.3988 - recall: 0.3532 - val_loss: 0.3522 - val_recall: 0.4274 - 2s/epoch - 6ms/step
Epoch 78/100
350/350 - 1s - loss: 0.3933 - recall: 0.3453 - val_loss: 0.3527 - val_recall: 0.4294 - 1s/epoch - 4ms/step
Epoch 79/100
350/350 - 1s - loss: 0.3972 - recall: 0.3567 - val_loss: 0.3535 - val_recall: 0.4274 - 1s/epoch - 3ms/step
Epoch 80/100
350/350 - 1s - loss: 0.3981 - recall: 0.3506 - val_loss: 0.3529 - val_recall: 0.4172 - 1s/epoch - 4ms/step
Epoch 81/100
350/350 - 1s - loss: 0.3960 - recall: 0.3769 - val_loss: 0.3528 - val_recall: 0.4376 - 1s/epoch - 3ms/step
Epoch 82/100
350/350 - 1s - loss: 0.3948 - recall: 0.3383 - val_loss: 0.3524 - val_recall: 0.4213 - 1s/epoch - 4ms/step
Epoch 83/100
350/350 - 1s - loss: 0.3936 - recall: 0.3523 - val_loss: 0.3529 - val_recall: 0.4274 - 1s/epoch - 3ms/step
Epoch 84/100
350/350 - 1s - loss: 0.3876 - recall: 0.3646 - val_loss: 0.3529 - val_recall: 0.4151 - 1s/epoch - 3ms/step
Epoch 85/100
350/350 - 1s - loss: 0.3900 - recall: 0.3479 - val_loss: 0.3517 - val_recall: 0.4438 - 1s/epoch - 4ms/step
Epoch 86/100
350/350 - 2s - loss: 0.3931 - recall: 0.3646 - val_loss: 0.3514 - val_recall: 0.4254 - 2s/epoch - 5ms/step
Epoch 87/100
350/350 - 2s - loss: 0.3872 - recall: 0.3734 - val_loss: 0.3511 - val_recall: 0.4499 - 2s/epoch - 6ms/step
Epoch 88/100
350/350 - 1s - loss: 0.3941 - recall: 0.3567 - val_loss: 0.3509 - val_recall: 0.4376 - 1s/epoch - 4ms/step
Epoch 89/100
350/350 - 1s - loss: 0.3980 - recall: 0.3497 - val_loss: 0.3509 - val_recall: 0.4417 - 1s/epoch - 4ms/step
Epoch 90/100
350/350 - 1s - loss: 0.3977 - recall: 0.3611 - val_loss: 0.3509 - val_recall: 0.4070 - 1s/epoch - 3ms/step
Epoch 91/100
350/350 - 1s - loss: 0.3917 - recall: 0.3690 - val_loss: 0.3500 - val_recall: 0.4540 - 1s/epoch - 4ms/step
Epoch 92/100
350/350 - 1s - loss: 0.3961 - recall: 0.3532 - val_loss: 0.3503 - val_recall: 0.4192 - 1s/epoch - 3ms/step
Epoch 93/100
350/350 - 1s - loss: 0.3884 - recall: 0.3716 - val_loss: 0.3503 - val_recall: 0.4397 - 1s/epoch - 3ms/step
Epoch 94/100
350/350 - 1s - loss: 0.3877 - recall: 0.3900 - val_loss: 0.3502 - val_recall: 0.4213 - 1s/epoch - 4ms/step
Epoch 95/100
350/350 - 1s - loss: 0.3882 - recall: 0.3699 - val_loss: 0.3499 - val_recall: 0.4479 - 1s/epoch - 4ms/step
Epoch 96/100
350/350 - 2s - loss: 0.3851 - recall: 0.3672 - val_loss: 0.3497 - val_recall: 0.4335 - 2s/epoch - 5ms/step
Epoch 97/100
350/350 - 2s - loss: 0.3905 - recall: 0.3707 - val_loss: 0.3494 - val_recall: 0.4376 - 2s/epoch - 5ms/step
Epoch 98/100
350/350 - 2s - loss: 0.3863 - recall: 0.3909 - val_loss: 0.3488 - val_recall: 0.4499 - 2s/epoch - 6ms/step
Epoch 99/100
350/350 - 1s - loss: 0.3942 - recall: 0.3734 - val_loss: 0.3490 - val_recall: 0.4479 - 1s/epoch - 3ms/step
Epoch 100/100
350/350 - 1s - loss: 0.3845 - recall: 0.3734 - val_loss: 0.3491 - val_recall: 0.4376 - 1s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                                                      2
# neurons - hidden layer                                [128, 128, 64]
activation function - hidden layer                        [relu, relu]
# epochs                                                           100
batch size                                                          16
optimizer                                                         Adam
learning rate, momentum, dropout                    [0.0001, 0.0, 0.3]
weight initializer                                          he_uniform
regularization                                                       -
train loss                                                    0.384523
validation loss                                               0.349065
train recall                                                  0.373357
validation recall                                             0.437628
time (secs)                                                     143.73
model                <keras.src.engine.sequential.Sequential object...
history              <keras.src.callbacks.History object at 0x7dfce...
Name: 6, dtype: object
The network with a dropout rate of 0.2 performed better than the one with a dropout rate of 0.3, but the recall scores still did not improve significantly. The next step is to apply the SMOTE oversampling technique and see whether the scores improve.
# To oversample the data
from imblearn.over_sampling import SMOTE
# Synthetic Minority Over Sampling Technique
sm = SMOTE(sampling_strategy=1, k_neighbors=5, random_state=1)
X_train_over, y_train_over = sm.fit_resample(X_train, y_train)
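With `sampling_strategy=1`, SMOTE synthesizes minority-class ("Exited" = 1) examples until both classes are the same size, which is why the later runs show 558 steps per epoch instead of 350: at batch size 16, the training set grows from 5,600 rows to roughly 8,928. A quick sanity check of the balance invariant can be done with `collections.Counter`; the label arrays below are illustrative, not the notebook's actual `y_train`:

```python
from collections import Counter

def check_balanced(y_resampled):
    """Return True if every class in the label vector has the same
    count, as SMOTE with sampling_strategy=1 should guarantee."""
    counts = Counter(y_resampled)
    return len(set(counts.values())) == 1

# Illustrative: 6 majority (0) and 2 minority (1) labels before SMOTE;
# after oversampling with sampling_strategy=1 we expect 6 of each.
y_before = [0, 0, 0, 0, 0, 0, 1, 1]
y_after = [0] * 6 + [1] * 6  # the shape of output SMOTE would produce
print(check_balanced(y_before), check_balanced(y_after))  # False True
```

The same check applied to `y_train_over` would confirm the resampling worked before spending time on training.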
i+=1
model_fit('relu','relu','SGD',X_train_over,y_train_over,100,i)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dense_1 (Dense) (None, 128) 16512
batch_normalization (Batch (None, 128) 512
Normalization)
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (Bat (None, 64) 256
chNormalization)
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
558/558 - 3s - loss: 0.7649 - recall: 0.5860 - val_loss: 0.6455 - val_recall: 0.6217 - 3s/epoch - 5ms/step
Epoch 2/100
558/558 - 1s - loss: 0.6047 - recall: 0.6811 - val_loss: 0.6093 - val_recall: 0.6564 - 1s/epoch - 3ms/step
Epoch 3/100
558/558 - 2s - loss: 0.5653 - recall: 0.7217 - val_loss: 0.5809 - val_recall: 0.6605 - 2s/epoch - 4ms/step
Epoch 4/100
558/558 - 3s - loss: 0.5504 - recall: 0.7300 - val_loss: 0.5743 - val_recall: 0.6667 - 3s/epoch - 4ms/step
Epoch 5/100
558/558 - 2s - loss: 0.5411 - recall: 0.7320 - val_loss: 0.5764 - val_recall: 0.6871 - 2s/epoch - 3ms/step
Epoch 6/100
558/558 - 1s - loss: 0.5305 - recall: 0.7437 - val_loss: 0.5557 - val_recall: 0.6830 - 1s/epoch - 3ms/step
Epoch 7/100
558/558 - 1s - loss: 0.5200 - recall: 0.7556 - val_loss: 0.5391 - val_recall: 0.6667 - 1s/epoch - 3ms/step
Epoch 8/100
558/558 - 2s - loss: 0.5124 - recall: 0.7587 - val_loss: 0.5365 - val_recall: 0.6769 - 2s/epoch - 3ms/step
Epoch 9/100
558/558 - 1s - loss: 0.5012 - recall: 0.7706 - val_loss: 0.5317 - val_recall: 0.6769 - 1s/epoch - 2ms/step
Epoch 10/100
558/558 - 1s - loss: 0.4956 - recall: 0.7661 - val_loss: 0.5329 - val_recall: 0.6851 - 1s/epoch - 3ms/step
Epoch 11/100
558/558 - 1s - loss: 0.4940 - recall: 0.7681 - val_loss: 0.5286 - val_recall: 0.6769 - 1s/epoch - 3ms/step
Epoch 12/100
558/558 - 2s - loss: 0.4874 - recall: 0.7751 - val_loss: 0.5148 - val_recall: 0.6687 - 2s/epoch - 4ms/step
Epoch 13/100
558/558 - 3s - loss: 0.4771 - recall: 0.7881 - val_loss: 0.5094 - val_recall: 0.6708 - 3s/epoch - 5ms/step
Epoch 14/100
558/558 - 1s - loss: 0.4720 - recall: 0.7881 - val_loss: 0.5215 - val_recall: 0.6851 - 1s/epoch - 3ms/step
Epoch 15/100
558/558 - 1s - loss: 0.4671 - recall: 0.7881 - val_loss: 0.5159 - val_recall: 0.6830 - 1s/epoch - 3ms/step
Epoch 16/100
558/558 - 1s - loss: 0.4650 - recall: 0.7919 - val_loss: 0.5166 - val_recall: 0.6892 - 1s/epoch - 3ms/step
Epoch 17/100
558/558 - 2s - loss: 0.4589 - recall: 0.7856 - val_loss: 0.5163 - val_recall: 0.6994 - 2s/epoch - 3ms/step
Epoch 18/100
558/558 - 1s - loss: 0.4593 - recall: 0.7838 - val_loss: 0.5092 - val_recall: 0.6933 - 1s/epoch - 3ms/step
Epoch 19/100
558/558 - 1s - loss: 0.4530 - recall: 0.7905 - val_loss: 0.4953 - val_recall: 0.6708 - 1s/epoch - 3ms/step
Epoch 20/100
558/558 - 2s - loss: 0.4453 - recall: 0.7937 - val_loss: 0.4978 - val_recall: 0.6830 - 2s/epoch - 3ms/step
Epoch 21/100
558/558 - 3s - loss: 0.4443 - recall: 0.8038 - val_loss: 0.5003 - val_recall: 0.6810 - 3s/epoch - 5ms/step
Epoch 22/100
558/558 - 2s - loss: 0.4460 - recall: 0.8024 - val_loss: 0.4978 - val_recall: 0.6789 - 2s/epoch - 4ms/step
Epoch 23/100
558/558 - 1s - loss: 0.4401 - recall: 0.7952 - val_loss: 0.5039 - val_recall: 0.6830 - 1s/epoch - 2ms/step
Epoch 24/100
558/558 - 1s - loss: 0.4355 - recall: 0.8127 - val_loss: 0.4962 - val_recall: 0.6748 - 1s/epoch - 3ms/step
Epoch 25/100
558/558 - 2s - loss: 0.4346 - recall: 0.8013 - val_loss: 0.4861 - val_recall: 0.6789 - 2s/epoch - 3ms/step
Epoch 26/100
558/558 - 2s - loss: 0.4292 - recall: 0.8069 - val_loss: 0.4829 - val_recall: 0.6687 - 2s/epoch - 3ms/step
Epoch 27/100
558/558 - 1s - loss: 0.4270 - recall: 0.8157 - val_loss: 0.4897 - val_recall: 0.6789 - 1s/epoch - 2ms/step
Epoch 28/100
558/558 - 1s - loss: 0.4246 - recall: 0.8062 - val_loss: 0.4962 - val_recall: 0.6748 - 1s/epoch - 3ms/step
Epoch 29/100
558/558 - 2s - loss: 0.4197 - recall: 0.8132 - val_loss: 0.4855 - val_recall: 0.6748 - 2s/epoch - 3ms/step
Epoch 30/100
558/558 - 3s - loss: 0.4236 - recall: 0.8181 - val_loss: 0.4824 - val_recall: 0.6646 - 3s/epoch - 5ms/step
Epoch 31/100
558/558 - 2s - loss: 0.4186 - recall: 0.8154 - val_loss: 0.4826 - val_recall: 0.6708 - 2s/epoch - 3ms/step
Epoch 32/100
558/558 - 1s - loss: 0.4065 - recall: 0.8199 - val_loss: 0.4879 - val_recall: 0.6728 - 1s/epoch - 3ms/step
Epoch 33/100
558/558 - 2s - loss: 0.4098 - recall: 0.8174 - val_loss: 0.4875 - val_recall: 0.6708 - 2s/epoch - 3ms/step
Epoch 34/100
558/558 - 1s - loss: 0.4050 - recall: 0.8282 - val_loss: 0.4796 - val_recall: 0.6728 - 1s/epoch - 3ms/step
Epoch 35/100
558/558 - 1s - loss: 0.4117 - recall: 0.8242 - val_loss: 0.4835 - val_recall: 0.6708 - 1s/epoch - 3ms/step
Epoch 36/100
558/558 - 2s - loss: 0.4103 - recall: 0.8174 - val_loss: 0.4967 - val_recall: 0.6933 - 2s/epoch - 3ms/step
Epoch 37/100
558/558 - 2s - loss: 0.4045 - recall: 0.8224 - val_loss: 0.4910 - val_recall: 0.6769 - 2s/epoch - 3ms/step
Epoch 38/100
558/558 - 3s - loss: 0.4006 - recall: 0.8233 - val_loss: 0.4819 - val_recall: 0.6687 - 3s/epoch - 5ms/step
Epoch 39/100
558/558 - 2s - loss: 0.3929 - recall: 0.8257 - val_loss: 0.4806 - val_recall: 0.6646 - 2s/epoch - 4ms/step
Epoch 40/100
558/558 - 2s - loss: 0.3960 - recall: 0.8246 - val_loss: 0.4761 - val_recall: 0.6646 - 2s/epoch - 3ms/step
Epoch 41/100
558/558 - 1s - loss: 0.3972 - recall: 0.8314 - val_loss: 0.4772 - val_recall: 0.6626 - 1s/epoch - 3ms/step
Epoch 42/100
558/558 - 2s - loss: 0.3886 - recall: 0.8320 - val_loss: 0.4933 - val_recall: 0.6830 - 2s/epoch - 3ms/step
Epoch 43/100
558/558 - 2s - loss: 0.3958 - recall: 0.8231 - val_loss: 0.4914 - val_recall: 0.6789 - 2s/epoch - 3ms/step
Epoch 44/100
558/558 - 1s - loss: 0.3847 - recall: 0.8293 - val_loss: 0.4778 - val_recall: 0.6544 - 1s/epoch - 3ms/step
Epoch 45/100
558/558 - 2s - loss: 0.3905 - recall: 0.8329 - val_loss: 0.4897 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 46/100
558/558 - 3s - loss: 0.3834 - recall: 0.8273 - val_loss: 0.4937 - val_recall: 0.6687 - 3s/epoch - 5ms/step
Epoch 47/100
558/558 - 3s - loss: 0.3815 - recall: 0.8280 - val_loss: 0.4829 - val_recall: 0.6564 - 3s/epoch - 6ms/step
Epoch 48/100
558/558 - 2s - loss: 0.3874 - recall: 0.8269 - val_loss: 0.4919 - val_recall: 0.6687 - 2s/epoch - 3ms/step
Epoch 49/100
558/558 - 2s - loss: 0.3773 - recall: 0.8338 - val_loss: 0.4829 - val_recall: 0.6626 - 2s/epoch - 3ms/step
Epoch 50/100
558/558 - 2s - loss: 0.3750 - recall: 0.8383 - val_loss: 0.4820 - val_recall: 0.6605 - 2s/epoch - 3ms/step
Epoch 51/100
558/558 - 2s - loss: 0.3733 - recall: 0.8376 - val_loss: 0.4929 - val_recall: 0.6708 - 2s/epoch - 4ms/step
Epoch 52/100
558/558 - 2s - loss: 0.3747 - recall: 0.8365 - val_loss: 0.4934 - val_recall: 0.6605 - 2s/epoch - 3ms/step
Epoch 53/100
558/558 - 2s - loss: 0.3752 - recall: 0.8367 - val_loss: 0.4858 - val_recall: 0.6605 - 2s/epoch - 3ms/step
Epoch 54/100
558/558 - 3s - loss: 0.3674 - recall: 0.8459 - val_loss: 0.4817 - val_recall: 0.6524 - 3s/epoch - 5ms/step
Epoch 55/100
558/558 - 2s - loss: 0.3693 - recall: 0.8408 - val_loss: 0.4798 - val_recall: 0.6483 - 2s/epoch - 3ms/step
Epoch 56/100
558/558 - 2s - loss: 0.3673 - recall: 0.8471 - val_loss: 0.4901 - val_recall: 0.6544 - 2s/epoch - 3ms/step
Epoch 57/100
558/558 - 1s - loss: 0.3633 - recall: 0.8446 - val_loss: 0.4854 - val_recall: 0.6503 - 1s/epoch - 3ms/step
Epoch 58/100
558/558 - 2s - loss: 0.3662 - recall: 0.8459 - val_loss: 0.4807 - val_recall: 0.6442 - 2s/epoch - 3ms/step
Epoch 59/100
558/558 - 2s - loss: 0.3662 - recall: 0.8426 - val_loss: 0.4907 - val_recall: 0.6503 - 2s/epoch - 3ms/step
Epoch 60/100
558/558 - 1s - loss: 0.3590 - recall: 0.8509 - val_loss: 0.4881 - val_recall: 0.6442 - 1s/epoch - 2ms/step
Epoch 61/100
558/558 - 1s - loss: 0.3570 - recall: 0.8468 - val_loss: 0.4896 - val_recall: 0.6564 - 1s/epoch - 3ms/step
Epoch 62/100
558/558 - 3s - loss: 0.3587 - recall: 0.8448 - val_loss: 0.4977 - val_recall: 0.6585 - 3s/epoch - 5ms/step
Epoch 63/100
558/558 - 2s - loss: 0.3530 - recall: 0.8511 - val_loss: 0.5014 - val_recall: 0.6667 - 2s/epoch - 4ms/step
Epoch 64/100
558/558 - 1s - loss: 0.3586 - recall: 0.8488 - val_loss: 0.4966 - val_recall: 0.6524 - 1s/epoch - 3ms/step
Epoch 65/100
558/558 - 2s - loss: 0.3587 - recall: 0.8471 - val_loss: 0.5020 - val_recall: 0.6585 - 2s/epoch - 3ms/step
Epoch 66/100
558/558 - 1s - loss: 0.3542 - recall: 0.8506 - val_loss: 0.4949 - val_recall: 0.6544 - 1s/epoch - 3ms/step
Epoch 67/100
558/558 - 2s - loss: 0.3497 - recall: 0.8486 - val_loss: 0.4907 - val_recall: 0.6462 - 2s/epoch - 3ms/step
Epoch 68/100
558/558 - 2s - loss: 0.3537 - recall: 0.8493 - val_loss: 0.4847 - val_recall: 0.6339 - 2s/epoch - 3ms/step
Epoch 69/100
558/558 - 1s - loss: 0.3498 - recall: 0.8491 - val_loss: 0.4886 - val_recall: 0.6401 - 1s/epoch - 3ms/step
Epoch 70/100
558/558 - 3s - loss: 0.3478 - recall: 0.8520 - val_loss: 0.4966 - val_recall: 0.6462 - 3s/epoch - 5ms/step
Epoch 71/100
558/558 - 3s - loss: 0.3450 - recall: 0.8524 - val_loss: 0.5070 - val_recall: 0.6544 - 3s/epoch - 5ms/step
Epoch 72/100
558/558 - 1s - loss: 0.3430 - recall: 0.8497 - val_loss: 0.4892 - val_recall: 0.6319 - 1s/epoch - 2ms/step
Epoch 73/100
558/558 - 1s - loss: 0.3409 - recall: 0.8553 - val_loss: 0.4941 - val_recall: 0.6462 - 1s/epoch - 2ms/step
Epoch 74/100
558/558 - 1s - loss: 0.3431 - recall: 0.8529 - val_loss: 0.4936 - val_recall: 0.6442 - 1s/epoch - 2ms/step
Epoch 75/100
558/558 - 1s - loss: 0.3417 - recall: 0.8531 - val_loss: 0.4827 - val_recall: 0.6278 - 1s/epoch - 3ms/step
Epoch 76/100
558/558 - 1s - loss: 0.3387 - recall: 0.8549 - val_loss: 0.5064 - val_recall: 0.6483 - 1s/epoch - 2ms/step
Epoch 77/100
558/558 - 1s - loss: 0.3342 - recall: 0.8652 - val_loss: 0.4924 - val_recall: 0.6401 - 1s/epoch - 3ms/step
Epoch 78/100
558/558 - 1s - loss: 0.3384 - recall: 0.8607 - val_loss: 0.4923 - val_recall: 0.6319 - 1s/epoch - 2ms/step
Epoch 79/100
558/558 - 2s - loss: 0.3329 - recall: 0.8596 - val_loss: 0.5019 - val_recall: 0.6503 - 2s/epoch - 4ms/step
Epoch 80/100
558/558 - 3s - loss: 0.3407 - recall: 0.8549 - val_loss: 0.4963 - val_recall: 0.6360 - 3s/epoch - 5ms/step
Epoch 81/100
558/558 - 2s - loss: 0.3390 - recall: 0.8542 - val_loss: 0.4985 - val_recall: 0.6360 - 2s/epoch - 3ms/step
Epoch 82/100
558/558 - 1s - loss: 0.3369 - recall: 0.8592 - val_loss: 0.5089 - val_recall: 0.6564 - 1s/epoch - 3ms/step
Epoch 83/100
558/558 - 1s - loss: 0.3352 - recall: 0.8594 - val_loss: 0.4990 - val_recall: 0.6319 - 1s/epoch - 3ms/step
Epoch 84/100
558/558 - 2s - loss: 0.3317 - recall: 0.8648 - val_loss: 0.5031 - val_recall: 0.6421 - 2s/epoch - 3ms/step
Epoch 85/100
558/558 - 1s - loss: 0.3297 - recall: 0.8574 - val_loss: 0.5106 - val_recall: 0.6483 - 1s/epoch - 3ms/step
Epoch 86/100
558/558 - 2s - loss: 0.3224 - recall: 0.8717 - val_loss: 0.5024 - val_recall: 0.6339 - 2s/epoch - 3ms/step
Epoch 87/100
558/558 - 2s - loss: 0.3197 - recall: 0.8657 - val_loss: 0.5047 - val_recall: 0.6503 - 2s/epoch - 4ms/step
Epoch 88/100
558/558 - 3s - loss: 0.3283 - recall: 0.8614 - val_loss: 0.5036 - val_recall: 0.6442 - 3s/epoch - 5ms/step
Epoch 89/100
558/558 - 2s - loss: 0.3215 - recall: 0.8677 - val_loss: 0.5053 - val_recall: 0.6483 - 2s/epoch - 4ms/step
Epoch 90/100
558/558 - 2s - loss: 0.3249 - recall: 0.8666 - val_loss: 0.5208 - val_recall: 0.6544 - 2s/epoch - 3ms/step
Epoch 91/100
558/558 - 1s - loss: 0.3206 - recall: 0.8659 - val_loss: 0.5123 - val_recall: 0.6421 - 1s/epoch - 3ms/step
Epoch 92/100
558/558 - 1s - loss: 0.3187 - recall: 0.8652 - val_loss: 0.5089 - val_recall: 0.6380 - 1s/epoch - 3ms/step
Epoch 93/100
558/558 - 1s - loss: 0.3201 - recall: 0.8693 - val_loss: 0.5078 - val_recall: 0.6360 - 1s/epoch - 2ms/step
Epoch 94/100
558/558 - 1s - loss: 0.3105 - recall: 0.8704 - val_loss: 0.5221 - val_recall: 0.6483 - 1s/epoch - 3ms/step
Epoch 95/100
558/558 - 1s - loss: 0.3134 - recall: 0.8670 - val_loss: 0.5220 - val_recall: 0.6585 - 1s/epoch - 3ms/step
Epoch 96/100
558/558 - 2s - loss: 0.3129 - recall: 0.8731 - val_loss: 0.5129 - val_recall: 0.6401 - 2s/epoch - 4ms/step
Epoch 97/100
558/558 - 3s - loss: 0.3088 - recall: 0.8733 - val_loss: 0.5084 - val_recall: 0.6339 - 3s/epoch - 5ms/step
Epoch 98/100
558/558 - 2s - loss: 0.3128 - recall: 0.8722 - val_loss: 0.5175 - val_recall: 0.6483 - 2s/epoch - 3ms/step
Epoch 99/100
558/558 - 1s - loss: 0.3074 - recall: 0.8760 - val_loss: 0.5291 - val_recall: 0.6524 - 1s/epoch - 3ms/step
Epoch 100/100
558/558 - 1s - loss: 0.3073 - recall: 0.8722 - val_loss: 0.5376 - val_recall: 0.6421 - 1s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                        2
# neurons - hidden layer               [128, 128, 64]
activation function - hidden layer     [relu, relu]
# epochs                               100
batch size                             16
optimizer                              SGD
learning rate, momentum, dropout       [0.001, 0.0, 0]
weight initializer                     he_uniform
regularization                         -
train loss                             0.30725
validation loss                        0.537618
train recall                           0.872169
validation recall                      0.642127
time (secs)                            177.0
model                                  <keras.src.engine.sequential.Sequential object...
history                                <keras.src.callbacks.History object at 0x7dfce...
Name: 7, dtype: object
The trained neural network behaves very differently on the validation data: the validation loss follows its own trajectory instead of tracking the training loss.
train recall 0.872169 & validation recall 0.642127
The recall score on the training data is very high, while the validation score is much lower. This large gap indicates that the model has overfit the training data, so it is not a good model.
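One common way to curb this kind of overfitting is early stopping: halt training once the validation loss stops improving instead of running the full 100 epochs. This is a minimal sketch using Keras's built-in callback, not part of the original notebook (it assumes the `model_fit` helper could be modified to pass callbacks through to `fit`):

```python
# Hypothetical sketch: stop training when validation loss stops improving,
# rather than running all 100 epochs and overfitting.
from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(
    monitor='val_loss',          # watch the validation loss
    patience=10,                 # tolerate 10 epochs without improvement
    restore_best_weights=True,   # roll back to the best epoch's weights
)
# Would be passed to training as: model.fit(..., callbacks=[early_stop])
```

With `restore_best_weights=True`, the model that comes out of `fit` is the one from the best validation epoch, not the last (overfit) one.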
i+=1
model_fit('relu','relu','SGD-Mom',X_train_over,y_train_over,100,i,learning_rte=1e-5,momentumval=0.9)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization)  (None, 128)  512
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization)  (None, 64)  256
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
558/558 - 3s - loss: 1.0112 - recall: 0.4806 - val_loss: 0.9935 - val_recall: 0.4928 - 3s/epoch - 5ms/step
Epoch 2/100
558/558 - 2s - loss: 0.8995 - recall: 0.5225 - val_loss: 0.9077 - val_recall: 0.5481 - 2s/epoch - 3ms/step
Epoch 3/100
558/558 - 3s - loss: 0.8214 - recall: 0.5600 - val_loss: 0.8380 - val_recall: 0.5501 - 3s/epoch - 5ms/step
Epoch 4/100
558/558 - 2s - loss: 0.7691 - recall: 0.5882 - val_loss: 0.8062 - val_recall: 0.5685 - 2s/epoch - 4ms/step
Epoch 5/100
558/558 - 1s - loss: 0.7353 - recall: 0.6015 - val_loss: 0.7703 - val_recall: 0.6012 - 1s/epoch - 2ms/step
Epoch 6/100
558/558 - 2s - loss: 0.7087 - recall: 0.6100 - val_loss: 0.7350 - val_recall: 0.6074 - 2s/epoch - 3ms/step
Epoch 7/100
558/558 - 1s - loss: 0.6881 - recall: 0.6313 - val_loss: 0.7027 - val_recall: 0.6176 - 1s/epoch - 3ms/step
Epoch 8/100
558/558 - 1s - loss: 0.6685 - recall: 0.6356 - val_loss: 0.6871 - val_recall: 0.6135 - 1s/epoch - 3ms/step
Epoch 9/100
558/558 - 1s - loss: 0.6532 - recall: 0.6425 - val_loss: 0.6776 - val_recall: 0.6237 - 1s/epoch - 3ms/step
Epoch 10/100
558/558 - 1s - loss: 0.6405 - recall: 0.6562 - val_loss: 0.6644 - val_recall: 0.6217 - 1s/epoch - 3ms/step
Epoch 11/100
558/558 - 2s - loss: 0.6365 - recall: 0.6535 - val_loss: 0.6583 - val_recall: 0.6319 - 2s/epoch - 4ms/step
Epoch 12/100
558/558 - 3s - loss: 0.6303 - recall: 0.6670 - val_loss: 0.6433 - val_recall: 0.6196 - 3s/epoch - 5ms/step
Epoch 13/100
558/558 - 2s - loss: 0.6152 - recall: 0.6735 - val_loss: 0.6333 - val_recall: 0.6217 - 2s/epoch - 3ms/step
Epoch 14/100
558/558 - 1s - loss: 0.6086 - recall: 0.6797 - val_loss: 0.6451 - val_recall: 0.6442 - 1s/epoch - 3ms/step
Epoch 15/100
558/558 - 1s - loss: 0.6063 - recall: 0.6806 - val_loss: 0.6323 - val_recall: 0.6462 - 1s/epoch - 3ms/step
Epoch 16/100
558/558 - 1s - loss: 0.5998 - recall: 0.6867 - val_loss: 0.6349 - val_recall: 0.6605 - 1s/epoch - 3ms/step
Epoch 17/100
558/558 - 2s - loss: 0.5916 - recall: 0.6941 - val_loss: 0.6294 - val_recall: 0.6544 - 2s/epoch - 3ms/step
Epoch 18/100
558/558 - 2s - loss: 0.5958 - recall: 0.6894 - val_loss: 0.6207 - val_recall: 0.6544 - 2s/epoch - 3ms/step
Epoch 19/100
558/558 - 2s - loss: 0.5886 - recall: 0.6925 - val_loss: 0.6119 - val_recall: 0.6585 - 2s/epoch - 4ms/step
Epoch 20/100
558/558 - 3s - loss: 0.5808 - recall: 0.7015 - val_loss: 0.6086 - val_recall: 0.6544 - 3s/epoch - 5ms/step
Epoch 21/100
558/558 - 2s - loss: 0.5834 - recall: 0.7044 - val_loss: 0.6111 - val_recall: 0.6626 - 2s/epoch - 3ms/step
Epoch 22/100
558/558 - 2s - loss: 0.5776 - recall: 0.7136 - val_loss: 0.6113 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 23/100
558/558 - 2s - loss: 0.5761 - recall: 0.7091 - val_loss: 0.6117 - val_recall: 0.6708 - 2s/epoch - 3ms/step
Epoch 24/100
558/558 - 2s - loss: 0.5747 - recall: 0.7114 - val_loss: 0.5949 - val_recall: 0.6585 - 2s/epoch - 3ms/step
Epoch 25/100
558/558 - 2s - loss: 0.5716 - recall: 0.7118 - val_loss: 0.5937 - val_recall: 0.6626 - 2s/epoch - 3ms/step
Epoch 26/100
558/558 - 1s - loss: 0.5689 - recall: 0.7172 - val_loss: 0.5889 - val_recall: 0.6544 - 1s/epoch - 3ms/step
Epoch 27/100
558/558 - 2s - loss: 0.5654 - recall: 0.7188 - val_loss: 0.5944 - val_recall: 0.6646 - 2s/epoch - 4ms/step
Epoch 28/100
558/558 - 3s - loss: 0.5648 - recall: 0.7156 - val_loss: 0.5946 - val_recall: 0.6708 - 3s/epoch - 5ms/step
Epoch 29/100
558/558 - 2s - loss: 0.5586 - recall: 0.7190 - val_loss: 0.5875 - val_recall: 0.6667 - 2s/epoch - 4ms/step
Epoch 30/100
558/558 - 2s - loss: 0.5632 - recall: 0.7212 - val_loss: 0.5867 - val_recall: 0.6667 - 2s/epoch - 3ms/step
Epoch 31/100
558/558 - 2s - loss: 0.5632 - recall: 0.7235 - val_loss: 0.5830 - val_recall: 0.6646 - 2s/epoch - 3ms/step
Epoch 32/100
558/558 - 2s - loss: 0.5535 - recall: 0.7311 - val_loss: 0.5880 - val_recall: 0.6708 - 2s/epoch - 3ms/step
Epoch 33/100
558/558 - 2s - loss: 0.5532 - recall: 0.7259 - val_loss: 0.5805 - val_recall: 0.6708 - 2s/epoch - 3ms/step
Epoch 34/100
558/558 - 2s - loss: 0.5524 - recall: 0.7342 - val_loss: 0.5750 - val_recall: 0.6687 - 2s/epoch - 3ms/step
Epoch 35/100
558/558 - 2s - loss: 0.5582 - recall: 0.7199 - val_loss: 0.5816 - val_recall: 0.6728 - 2s/epoch - 4ms/step
Epoch 36/100
558/558 - 3s - loss: 0.5531 - recall: 0.7268 - val_loss: 0.5851 - val_recall: 0.6769 - 3s/epoch - 5ms/step
Epoch 37/100
558/558 - 2s - loss: 0.5473 - recall: 0.7273 - val_loss: 0.5757 - val_recall: 0.6687 - 2s/epoch - 4ms/step
Epoch 38/100
558/558 - 2s - loss: 0.5465 - recall: 0.7320 - val_loss: 0.5785 - val_recall: 0.6769 - 2s/epoch - 3ms/step
Epoch 39/100
558/558 - 2s - loss: 0.5442 - recall: 0.7412 - val_loss: 0.5734 - val_recall: 0.6646 - 2s/epoch - 3ms/step
Epoch 40/100
558/558 - 2s - loss: 0.5440 - recall: 0.7372 - val_loss: 0.5691 - val_recall: 0.6605 - 2s/epoch - 3ms/step
Epoch 41/100
558/558 - 2s - loss: 0.5421 - recall: 0.7430 - val_loss: 0.5687 - val_recall: 0.6626 - 2s/epoch - 3ms/step
Epoch 42/100
558/558 - 2s - loss: 0.5395 - recall: 0.7374 - val_loss: 0.5774 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 43/100
558/558 - 3s - loss: 0.5429 - recall: 0.7416 - val_loss: 0.5737 - val_recall: 0.6748 - 3s/epoch - 4ms/step
Epoch 44/100
558/558 - 3s - loss: 0.5381 - recall: 0.7423 - val_loss: 0.5627 - val_recall: 0.6605 - 3s/epoch - 6ms/step
Epoch 45/100
558/558 - 2s - loss: 0.5445 - recall: 0.7423 - val_loss: 0.5637 - val_recall: 0.6667 - 2s/epoch - 3ms/step
Epoch 46/100
558/558 - 2s - loss: 0.5361 - recall: 0.7396 - val_loss: 0.5749 - val_recall: 0.6769 - 2s/epoch - 3ms/step
Epoch 47/100
558/558 - 2s - loss: 0.5338 - recall: 0.7450 - val_loss: 0.5651 - val_recall: 0.6667 - 2s/epoch - 3ms/step
Epoch 48/100
558/558 - 1s - loss: 0.5430 - recall: 0.7349 - val_loss: 0.5652 - val_recall: 0.6687 - 1s/epoch - 3ms/step
Epoch 49/100
558/558 - 1s - loss: 0.5301 - recall: 0.7428 - val_loss: 0.5562 - val_recall: 0.6626 - 1s/epoch - 3ms/step
Epoch 50/100
558/558 - 2s - loss: 0.5313 - recall: 0.7533 - val_loss: 0.5600 - val_recall: 0.6646 - 2s/epoch - 3ms/step
Epoch 51/100
558/558 - 2s - loss: 0.5296 - recall: 0.7470 - val_loss: 0.5640 - val_recall: 0.6728 - 2s/epoch - 4ms/step
Epoch 52/100
558/558 - 3s - loss: 0.5323 - recall: 0.7486 - val_loss: 0.5636 - val_recall: 0.6708 - 3s/epoch - 5ms/step
Epoch 53/100
558/558 - 2s - loss: 0.5310 - recall: 0.7425 - val_loss: 0.5638 - val_recall: 0.6708 - 2s/epoch - 3ms/step
Epoch 54/100
558/558 - 1s - loss: 0.5272 - recall: 0.7470 - val_loss: 0.5543 - val_recall: 0.6626 - 1s/epoch - 3ms/step
Epoch 55/100
558/558 - 1s - loss: 0.5252 - recall: 0.7569 - val_loss: 0.5561 - val_recall: 0.6687 - 1s/epoch - 3ms/step
Epoch 56/100
558/558 - 1s - loss: 0.5217 - recall: 0.7558 - val_loss: 0.5577 - val_recall: 0.6748 - 1s/epoch - 3ms/step
Epoch 57/100
558/558 - 1s - loss: 0.5255 - recall: 0.7504 - val_loss: 0.5525 - val_recall: 0.6626 - 1s/epoch - 3ms/step
Epoch 58/100
558/558 - 1s - loss: 0.5260 - recall: 0.7475 - val_loss: 0.5467 - val_recall: 0.6605 - 1s/epoch - 3ms/step
Epoch 59/100
558/558 - 2s - loss: 0.5215 - recall: 0.7567 - val_loss: 0.5540 - val_recall: 0.6687 - 2s/epoch - 3ms/step
Epoch 60/100
558/558 - 3s - loss: 0.5204 - recall: 0.7558 - val_loss: 0.5510 - val_recall: 0.6748 - 3s/epoch - 5ms/step
Epoch 61/100
558/558 - 2s - loss: 0.5190 - recall: 0.7490 - val_loss: 0.5477 - val_recall: 0.6728 - 2s/epoch - 4ms/step
Epoch 62/100
558/558 - 1s - loss: 0.5226 - recall: 0.7497 - val_loss: 0.5547 - val_recall: 0.6769 - 1s/epoch - 2ms/step
Epoch 63/100
558/558 - 1s - loss: 0.5202 - recall: 0.7547 - val_loss: 0.5612 - val_recall: 0.6871 - 1s/epoch - 3ms/step
Epoch 64/100
558/558 - 2s - loss: 0.5178 - recall: 0.7538 - val_loss: 0.5574 - val_recall: 0.6789 - 2s/epoch - 3ms/step
Epoch 65/100
558/558 - 2s - loss: 0.5194 - recall: 0.7542 - val_loss: 0.5536 - val_recall: 0.6769 - 2s/epoch - 3ms/step
Epoch 66/100
558/558 - 2s - loss: 0.5199 - recall: 0.7578 - val_loss: 0.5467 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 67/100
558/558 - 2s - loss: 0.5127 - recall: 0.7567 - val_loss: 0.5505 - val_recall: 0.6708 - 2s/epoch - 3ms/step
Epoch 68/100
558/558 - 2s - loss: 0.5160 - recall: 0.7558 - val_loss: 0.5336 - val_recall: 0.6524 - 2s/epoch - 4ms/step
Epoch 69/100
558/558 - 3s - loss: 0.5112 - recall: 0.7607 - val_loss: 0.5442 - val_recall: 0.6708 - 3s/epoch - 5ms/step
Epoch 70/100
558/558 - 1s - loss: 0.5143 - recall: 0.7607 - val_loss: 0.5512 - val_recall: 0.6789 - 1s/epoch - 3ms/step
Epoch 71/100
558/558 - 2s - loss: 0.5143 - recall: 0.7578 - val_loss: 0.5527 - val_recall: 0.6810 - 2s/epoch - 3ms/step
Epoch 72/100
558/558 - 2s - loss: 0.5125 - recall: 0.7600 - val_loss: 0.5434 - val_recall: 0.6748 - 2s/epoch - 3ms/step
Epoch 73/100
558/558 - 2s - loss: 0.5100 - recall: 0.7596 - val_loss: 0.5498 - val_recall: 0.6810 - 2s/epoch - 3ms/step
Epoch 74/100
558/558 - 2s - loss: 0.5064 - recall: 0.7634 - val_loss: 0.5376 - val_recall: 0.6667 - 2s/epoch - 3ms/step
Epoch 75/100
558/558 - 1s - loss: 0.5147 - recall: 0.7529 - val_loss: 0.5359 - val_recall: 0.6667 - 1s/epoch - 3ms/step
Epoch 76/100
558/558 - 3s - loss: 0.5089 - recall: 0.7560 - val_loss: 0.5449 - val_recall: 0.6830 - 3s/epoch - 5ms/step
Epoch 77/100
558/558 - 3s - loss: 0.5074 - recall: 0.7612 - val_loss: 0.5352 - val_recall: 0.6748 - 3s/epoch - 5ms/step
Epoch 78/100
558/558 - 2s - loss: 0.5075 - recall: 0.7603 - val_loss: 0.5478 - val_recall: 0.6810 - 2s/epoch - 3ms/step
Epoch 79/100
558/558 - 2s - loss: 0.5094 - recall: 0.7585 - val_loss: 0.5409 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 80/100
558/558 - 1s - loss: 0.5125 - recall: 0.7547 - val_loss: 0.5444 - val_recall: 0.6810 - 1s/epoch - 3ms/step
Epoch 81/100
558/558 - 1s - loss: 0.5119 - recall: 0.7641 - val_loss: 0.5316 - val_recall: 0.6687 - 1s/epoch - 3ms/step
Epoch 82/100
558/558 - 2s - loss: 0.5052 - recall: 0.7670 - val_loss: 0.5453 - val_recall: 0.6769 - 2s/epoch - 3ms/step
Epoch 83/100
558/558 - 2s - loss: 0.5062 - recall: 0.7614 - val_loss: 0.5376 - val_recall: 0.6687 - 2s/epoch - 3ms/step
Epoch 84/100
558/558 - 2s - loss: 0.5028 - recall: 0.7650 - val_loss: 0.5374 - val_recall: 0.6748 - 2s/epoch - 4ms/step
Epoch 85/100
558/558 - 3s - loss: 0.5041 - recall: 0.7600 - val_loss: 0.5324 - val_recall: 0.6708 - 3s/epoch - 5ms/step
Epoch 86/100
558/558 - 2s - loss: 0.4954 - recall: 0.7760 - val_loss: 0.5391 - val_recall: 0.6769 - 2s/epoch - 3ms/step
Epoch 87/100
558/558 - 2s - loss: 0.5000 - recall: 0.7641 - val_loss: 0.5366 - val_recall: 0.6769 - 2s/epoch - 3ms/step
Epoch 88/100
558/558 - 1s - loss: 0.5041 - recall: 0.7681 - val_loss: 0.5401 - val_recall: 0.6748 - 1s/epoch - 2ms/step
Epoch 89/100
558/558 - 1s - loss: 0.5009 - recall: 0.7708 - val_loss: 0.5338 - val_recall: 0.6789 - 1s/epoch - 3ms/step
Epoch 90/100
558/558 - 1s - loss: 0.5012 - recall: 0.7661 - val_loss: 0.5420 - val_recall: 0.6892 - 1s/epoch - 3ms/step
Epoch 91/100
558/558 - 1s - loss: 0.4964 - recall: 0.7652 - val_loss: 0.5369 - val_recall: 0.6789 - 1s/epoch - 2ms/step
Epoch 92/100
558/558 - 2s - loss: 0.4947 - recall: 0.7630 - val_loss: 0.5342 - val_recall: 0.6789 - 2s/epoch - 3ms/step
Epoch 93/100
558/558 - 3s - loss: 0.4995 - recall: 0.7641 - val_loss: 0.5313 - val_recall: 0.6687 - 3s/epoch - 5ms/step
Epoch 94/100
558/558 - 3s - loss: 0.4948 - recall: 0.7625 - val_loss: 0.5423 - val_recall: 0.6851 - 3s/epoch - 5ms/step
Epoch 95/100
558/558 - 1s - loss: 0.4921 - recall: 0.7710 - val_loss: 0.5314 - val_recall: 0.6789 - 1s/epoch - 2ms/step
Epoch 96/100
558/558 - 2s - loss: 0.4924 - recall: 0.7753 - val_loss: 0.5333 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 97/100
558/558 - 1s - loss: 0.4964 - recall: 0.7686 - val_loss: 0.5315 - val_recall: 0.6830 - 1s/epoch - 3ms/step
Epoch 98/100
558/558 - 1s - loss: 0.4941 - recall: 0.7748 - val_loss: 0.5296 - val_recall: 0.6830 - 1s/epoch - 2ms/step
Epoch 99/100
558/558 - 1s - loss: 0.4911 - recall: 0.7715 - val_loss: 0.5306 - val_recall: 0.6789 - 1s/epoch - 3ms/step
Epoch 100/100
558/558 - 1s - loss: 0.4929 - recall: 0.7764 - val_loss: 0.5236 - val_recall: 0.6667 - 1s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                        2
# neurons - hidden layer               [128, 128, 64]
activation function - hidden layer     [relu, relu]
# epochs                               100
batch size                             16
optimizer                              SGD-Mom
learning rate, momentum, dropout       [1e-05, 0.9, 0]
weight initializer                     he_uniform
regularization                         -
train loss                             0.492929
validation loss                        0.523605
train recall                           0.776407
validation recall                      0.666667
time (secs)                            203.16
model                                  <keras.src.engine.sequential.Sequential object...
history                                <keras.src.callbacks.History object at 0x7dfce...
Name: 8, dtype: object
The SGD optimizer with momentum shows good convergence of the loss, and the network behaves similarly on both the training and validation data.
train recall 0.776407 & validation recall 0.666667
The recall score of the neural network is good on the training data but lower on the validation data.
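The momentum update behind this smoother convergence can be sketched in plain NumPy. This is an illustration of the standard rule (velocity v ← μ·v − η·∇L, then w ← w + v), not the notebook's actual training code; the function name and toy gradient values are invented for the example:

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=1e-5, momentum=0.9):
    """One parameter update: accumulate a decaying average of past gradients."""
    velocity = momentum * velocity - lr * grad   # v <- mu*v - lr*grad
    w = w + velocity                             # w <- w + v
    return w, velocity

w = np.array([1.0, -2.0])
v = np.zeros_like(w)
# Repeated steps with the same gradient: the velocity builds up, so the
# effective step size grows toward lr / (1 - momentum) per unit gradient,
# which smooths out noisy per-batch gradients.
for _ in range(3):
    w, v = sgd_momentum_step(w, np.array([0.5, -0.5]), v)
```

Because past gradients are averaged into the velocity, oscillating gradient components cancel while consistent ones accumulate, which is why the loss curve here looks steadier than plain SGD's.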
i+=1
model_fit('relu','relu','RMS',X_train_over,y_train_over,100,i,learning_rte=1e-5,momentumval=0.9)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization)  (None, 128)  512
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization)  (None, 64)  256
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
558/558 - 3s - loss: 0.6709 - recall: 0.6430 - val_loss: 0.5824 - val_recall: 0.6748 - 3s/epoch - 5ms/step
Epoch 2/100
558/558 - 1s - loss: 0.5339 - recall: 0.7421 - val_loss: 0.5513 - val_recall: 0.6728 - 1s/epoch - 3ms/step
Epoch 3/100
558/558 - 2s - loss: 0.4953 - recall: 0.7712 - val_loss: 0.5163 - val_recall: 0.6789 - 2s/epoch - 4ms/step
Epoch 4/100
558/558 - 2s - loss: 0.4699 - recall: 0.7802 - val_loss: 0.5031 - val_recall: 0.6667 - 2s/epoch - 4ms/step
Epoch 5/100
558/558 - 2s - loss: 0.4558 - recall: 0.7928 - val_loss: 0.5014 - val_recall: 0.6748 - 2s/epoch - 3ms/step
Epoch 6/100
558/558 - 1s - loss: 0.4414 - recall: 0.7975 - val_loss: 0.4849 - val_recall: 0.6667 - 1s/epoch - 3ms/step
Epoch 7/100
558/558 - 1s - loss: 0.4288 - recall: 0.8107 - val_loss: 0.4665 - val_recall: 0.6462 - 1s/epoch - 2ms/step
Epoch 8/100
558/558 - 1s - loss: 0.4196 - recall: 0.8067 - val_loss: 0.4823 - val_recall: 0.6646 - 1s/epoch - 3ms/step
Epoch 9/100
558/558 - 1s - loss: 0.4062 - recall: 0.8217 - val_loss: 0.4806 - val_recall: 0.6483 - 1s/epoch - 2ms/step
Epoch 10/100
558/558 - 1s - loss: 0.4001 - recall: 0.8204 - val_loss: 0.4947 - val_recall: 0.6667 - 1s/epoch - 3ms/step
Epoch 11/100
558/558 - 1s - loss: 0.3977 - recall: 0.8210 - val_loss: 0.4828 - val_recall: 0.6605 - 1s/epoch - 3ms/step
Epoch 12/100
558/558 - 2s - loss: 0.3852 - recall: 0.8320 - val_loss: 0.4746 - val_recall: 0.6442 - 2s/epoch - 4ms/step
Epoch 13/100
558/558 - 2s - loss: 0.3796 - recall: 0.8334 - val_loss: 0.4685 - val_recall: 0.6462 - 2s/epoch - 4ms/step
Epoch 14/100
558/558 - 2s - loss: 0.3744 - recall: 0.8410 - val_loss: 0.4806 - val_recall: 0.6626 - 2s/epoch - 3ms/step
Epoch 15/100
558/558 - 2s - loss: 0.3697 - recall: 0.8385 - val_loss: 0.4794 - val_recall: 0.6524 - 2s/epoch - 3ms/step
Epoch 16/100
558/558 - 1s - loss: 0.3652 - recall: 0.8410 - val_loss: 0.4851 - val_recall: 0.6544 - 1s/epoch - 2ms/step
Epoch 17/100
558/558 - 1s - loss: 0.3594 - recall: 0.8379 - val_loss: 0.4733 - val_recall: 0.6483 - 1s/epoch - 3ms/step
Epoch 18/100
558/558 - 1s - loss: 0.3556 - recall: 0.8455 - val_loss: 0.4966 - val_recall: 0.6708 - 1s/epoch - 3ms/step
Epoch 19/100
558/558 - 1s - loss: 0.3501 - recall: 0.8502 - val_loss: 0.4767 - val_recall: 0.6421 - 1s/epoch - 3ms/step
Epoch 20/100
558/558 - 1s - loss: 0.3460 - recall: 0.8536 - val_loss: 0.4807 - val_recall: 0.6442 - 1s/epoch - 3ms/step
Epoch 21/100
558/558 - 2s - loss: 0.3418 - recall: 0.8553 - val_loss: 0.4905 - val_recall: 0.6462 - 2s/epoch - 3ms/step
Epoch 22/100
558/558 - 2s - loss: 0.3436 - recall: 0.8547 - val_loss: 0.4867 - val_recall: 0.6442 - 2s/epoch - 4ms/step
Epoch 23/100
558/558 - 2s - loss: 0.3337 - recall: 0.8623 - val_loss: 0.5106 - val_recall: 0.6544 - 2s/epoch - 3ms/step
Epoch 24/100
558/558 - 2s - loss: 0.3304 - recall: 0.8724 - val_loss: 0.4932 - val_recall: 0.6401 - 2s/epoch - 3ms/step
Epoch 25/100
558/558 - 1s - loss: 0.3299 - recall: 0.8645 - val_loss: 0.4977 - val_recall: 0.6339 - 1s/epoch - 3ms/step
Epoch 26/100
558/558 - 1s - loss: 0.3213 - recall: 0.8645 - val_loss: 0.4921 - val_recall: 0.6401 - 1s/epoch - 3ms/step
Epoch 27/100
558/558 - 2s - loss: 0.3204 - recall: 0.8693 - val_loss: 0.5059 - val_recall: 0.6442 - 2s/epoch - 3ms/step
Epoch 28/100
558/558 - 1s - loss: 0.3152 - recall: 0.8715 - val_loss: 0.5187 - val_recall: 0.6380 - 1s/epoch - 2ms/step
Epoch 29/100
558/558 - 1s - loss: 0.3126 - recall: 0.8688 - val_loss: 0.5061 - val_recall: 0.6299 - 1s/epoch - 3ms/step
Epoch 30/100
558/558 - 2s - loss: 0.3146 - recall: 0.8690 - val_loss: 0.5124 - val_recall: 0.6196 - 2s/epoch - 4ms/step
Epoch 31/100
558/558 - 2s - loss: 0.3100 - recall: 0.8693 - val_loss: 0.5046 - val_recall: 0.6155 - 2s/epoch - 4ms/step
Epoch 32/100
558/558 - 1s - loss: 0.2916 - recall: 0.8814 - val_loss: 0.5283 - val_recall: 0.6421 - 1s/epoch - 3ms/step
Epoch 33/100
558/558 - 2s - loss: 0.3002 - recall: 0.8769 - val_loss: 0.5167 - val_recall: 0.6155 - 2s/epoch - 3ms/step
Epoch 34/100
558/558 - 1s - loss: 0.2927 - recall: 0.8825 - val_loss: 0.5179 - val_recall: 0.6012 - 1s/epoch - 2ms/step
Epoch 35/100
558/558 - 1s - loss: 0.2977 - recall: 0.8818 - val_loss: 0.5242 - val_recall: 0.6258 - 1s/epoch - 3ms/step
Epoch 36/100
558/558 - 1s - loss: 0.2994 - recall: 0.8791 - val_loss: 0.5422 - val_recall: 0.6462 - 1s/epoch - 3ms/step
Epoch 37/100
558/558 - 1s - loss: 0.2913 - recall: 0.8841 - val_loss: 0.5412 - val_recall: 0.6360 - 1s/epoch - 2ms/step
Epoch 38/100
558/558 - 2s - loss: 0.2840 - recall: 0.8845 - val_loss: 0.5393 - val_recall: 0.6258 - 2s/epoch - 3ms/step
Epoch 39/100
558/558 - 2s - loss: 0.2784 - recall: 0.8874 - val_loss: 0.5425 - val_recall: 0.6237 - 2s/epoch - 4ms/step
Epoch 40/100
558/558 - 2s - loss: 0.2810 - recall: 0.8861 - val_loss: 0.5477 - val_recall: 0.6421 - 2s/epoch - 4ms/step
Epoch 41/100
558/558 - 2s - loss: 0.2783 - recall: 0.8879 - val_loss: 0.5491 - val_recall: 0.6155 - 2s/epoch - 3ms/step
Epoch 42/100
558/558 - 2s - loss: 0.2720 - recall: 0.8957 - val_loss: 0.5655 - val_recall: 0.6380 - 2s/epoch - 3ms/step
Epoch 43/100
558/558 - 1s - loss: 0.2749 - recall: 0.8939 - val_loss: 0.5612 - val_recall: 0.6278 - 1s/epoch - 3ms/step
Epoch 44/100
558/558 - 1s - loss: 0.2652 - recall: 0.9000 - val_loss: 0.5564 - val_recall: 0.6196 - 1s/epoch - 2ms/step
Epoch 45/100
558/558 - 1s - loss: 0.2698 - recall: 0.8944 - val_loss: 0.5694 - val_recall: 0.6299 - 1s/epoch - 2ms/step
Epoch 46/100
558/558 - 1s - loss: 0.2607 - recall: 0.9004 - val_loss: 0.5826 - val_recall: 0.6299 - 1s/epoch - 2ms/step
Epoch 47/100
558/558 - 1s - loss: 0.2603 - recall: 0.8941 - val_loss: 0.5762 - val_recall: 0.6237 - 1s/epoch - 2ms/step
Epoch 48/100
558/558 - 2s - loss: 0.2636 - recall: 0.8982 - val_loss: 0.5781 - val_recall: 0.6217 - 2s/epoch - 4ms/step
Epoch 49/100
558/558 - 2s - loss: 0.2600 - recall: 0.8971 - val_loss: 0.5898 - val_recall: 0.6319 - 2s/epoch - 4ms/step
Epoch 50/100
558/558 - 2s - loss: 0.2504 - recall: 0.8995 - val_loss: 0.5788 - val_recall: 0.6196 - 2s/epoch - 3ms/step
Epoch 51/100
558/558 - 2s - loss: 0.2535 - recall: 0.9063 - val_loss: 0.5824 - val_recall: 0.6176 - 2s/epoch - 3ms/step
Epoch 52/100
558/558 - 1s - loss: 0.2515 - recall: 0.9040 - val_loss: 0.5834 - val_recall: 0.6115 - 1s/epoch - 3ms/step
Epoch 53/100
558/558 - 1s - loss: 0.2492 - recall: 0.9060 - val_loss: 0.5746 - val_recall: 0.5849 - 1s/epoch - 3ms/step
Epoch 54/100
558/558 - 1s - loss: 0.2444 - recall: 0.9058 - val_loss: 0.5929 - val_recall: 0.5869 - 1s/epoch - 3ms/step
Epoch 55/100
558/558 - 2s - loss: 0.2403 - recall: 0.9134 - val_loss: 0.5936 - val_recall: 0.6074 - 2s/epoch - 3ms/step
Epoch 56/100
558/558 - 1s - loss: 0.2441 - recall: 0.8995 - val_loss: 0.6024 - val_recall: 0.6217 - 1s/epoch - 3ms/step
Epoch 57/100
558/558 - 2s - loss: 0.2358 - recall: 0.9092 - val_loss: 0.6187 - val_recall: 0.6339 - 2s/epoch - 4ms/step
Epoch 58/100
558/558 - 2s - loss: 0.2398 - recall: 0.9107 - val_loss: 0.6024 - val_recall: 0.5685 - 2s/epoch - 4ms/step
Epoch 59/100
558/558 - 2s - loss: 0.2420 - recall: 0.9098 - val_loss: 0.6160 - val_recall: 0.6053 - 2s/epoch - 3ms/step
Epoch 60/100
558/558 - 1s - loss: 0.2318 - recall: 0.9125 - val_loss: 0.6149 - val_recall: 0.6033 - 1s/epoch - 3ms/step
Epoch 61/100
558/558 - 2s - loss: 0.2291 - recall: 0.9092 - val_loss: 0.6216 - val_recall: 0.6115 - 2s/epoch - 3ms/step
Epoch 62/100
558/558 - 1s - loss: 0.2295 - recall: 0.9159 - val_loss: 0.6406 - val_recall: 0.6115 - 1s/epoch - 2ms/step
Epoch 63/100
558/558 - 1s - loss: 0.2211 - recall: 0.9177 - val_loss: 0.6276 - val_recall: 0.5992 - 1s/epoch - 2ms/step
Epoch 64/100
558/558 - 1s - loss: 0.2273 - recall: 0.9168 - val_loss: 0.6472 - val_recall: 0.6237 - 1s/epoch - 2ms/step
Epoch 65/100
558/558 - 1s - loss: 0.2275 - recall: 0.9155 - val_loss: 0.6591 - val_recall: 0.6319 - 1s/epoch - 3ms/step
Epoch 66/100
558/558 - 2s - loss: 0.2274 - recall: 0.9123 - val_loss: 0.6398 - val_recall: 0.6033 - 2s/epoch - 3ms/step
Epoch 67/100
558/558 - 2s - loss: 0.2221 - recall: 0.9152 - val_loss: 0.6401 - val_recall: 0.6033 - 2s/epoch - 4ms/step
Epoch 68/100
558/558 - 2s - loss: 0.2245 - recall: 0.9143 - val_loss: 0.6373 - val_recall: 0.5951 - 2s/epoch - 4ms/step
Epoch 69/100
558/558 - 1s - loss: 0.2251 - recall: 0.9119 - val_loss: 0.6584 - val_recall: 0.5890 - 1s/epoch - 3ms/step
Epoch 70/100
558/558 - 2s - loss: 0.2158 - recall: 0.9148 - val_loss: 0.6543 - val_recall: 0.5685 - 2s/epoch - 3ms/step
Epoch 71/100
558/558 - 1s - loss: 0.2207 - recall: 0.9206 - val_loss: 0.6775 - val_recall: 0.6339 - 1s/epoch - 3ms/step
Epoch 72/100
558/558 - 1s - loss: 0.2106 - recall: 0.9157 - val_loss: 0.6527 - val_recall: 0.5828 - 1s/epoch - 2ms/step
Epoch 73/100
558/558 - 1s - loss: 0.2099 - recall: 0.9199 - val_loss: 0.6655 - val_recall: 0.6074 - 1s/epoch - 3ms/step
Epoch 74/100
558/558 - 1s - loss: 0.2167 - recall: 0.9163 - val_loss: 0.6566 - val_recall: 0.5869 - 1s/epoch - 3ms/step
Epoch 75/100
558/558 - 2s - loss: 0.2238 - recall: 0.9163 - val_loss: 0.6385 - val_recall: 0.5706 - 2s/epoch - 3ms/step
Epoch 76/100
558/558 - 2s - loss: 0.2084 - recall: 0.9269 - val_loss: 0.6701 - val_recall: 0.5992 - 2s/epoch - 4ms/step
Epoch 77/100
558/558 - 2s - loss: 0.2098 - recall: 0.9231 - val_loss: 0.6612 - val_recall: 0.5930 - 2s/epoch - 4ms/step
Epoch 78/100
558/558 - 1s - loss: 0.2040 - recall: 0.9271 - val_loss: 0.6594 - val_recall: 0.6033 - 1s/epoch - 2ms/step
Epoch 79/100
558/558 - 1s - loss: 0.2019 - recall: 0.9235 - val_loss: 0.6609 - val_recall: 0.5767 - 1s/epoch - 3ms/step
Epoch 80/100
558/558 - 1s - loss: 0.2084 - recall: 0.9177 - val_loss: 0.6576 - val_recall: 0.5971 - 1s/epoch - 3ms/step
Epoch 81/100
558/558 - 1s - loss: 0.2076 - recall: 0.9217 - val_loss: 0.6816 - val_recall: 0.6012 - 1s/epoch - 3ms/step
Epoch 82/100
558/558 - 2s - loss: 0.2085 - recall: 0.9242 - val_loss: 0.6781 - val_recall: 0.6012 - 2s/epoch - 3ms/step
Epoch 83/100
558/558 - 2s - loss: 0.2039 - recall: 0.9240 - val_loss: 0.6849 - val_recall: 0.5644 - 2s/epoch - 3ms/step
Epoch 84/100
558/558 - 2s - loss: 0.2044 - recall: 0.9226 - val_loss: 0.6868 - val_recall: 0.5971 - 2s/epoch - 3ms/step
Epoch 85/100
558/558 - 2s - loss: 0.2002 - recall: 0.9229 - val_loss: 0.6986 - val_recall: 0.5869 - 2s/epoch - 4ms/step
Epoch 86/100
558/558 - 2s - loss: 0.1961 - recall: 0.9320 - val_loss: 0.6952 - val_recall: 0.5706 - 2s/epoch - 4ms/step
Epoch 87/100
558/558 - 2s - loss: 0.1887 - recall: 0.9325 - val_loss: 0.7035 - val_recall: 0.5767 - 2s/epoch - 3ms/step
Epoch 88/100
558/558 - 1s - loss: 0.1947 - recall: 0.9231 - val_loss: 0.7131 - val_recall: 0.5971 - 1s/epoch - 3ms/step
Epoch 89/100
558/558 - 1s - loss: 0.1926 - recall: 0.9307 - val_loss: 0.7187 - val_recall: 0.6053 - 1s/epoch - 3ms/step
Epoch 90/100
558/558 - 1s - loss: 0.1963 - recall: 0.9282 - val_loss: 0.6949 - val_recall: 0.5706 - 1s/epoch - 3ms/step
Epoch 91/100
558/558 - 1s - loss: 0.1908 - recall: 0.9242 - val_loss: 0.7025 - val_recall: 0.5746 - 1s/epoch - 3ms/step
Epoch 92/100
558/558 - 2s - loss: 0.1928 - recall: 0.9323 - val_loss: 0.7016 - val_recall: 0.5603 - 2s/epoch - 3ms/step
Epoch 93/100
558/558 - 2s - loss: 0.1922 - recall: 0.9271 - val_loss: 0.6993 - val_recall: 0.5583 - 2s/epoch - 3ms/step
Epoch 94/100
558/558 - 2s - loss: 0.1785 - recall: 0.9403 - val_loss: 0.7441 - val_recall: 0.6115 - 2s/epoch - 4ms/step
Epoch 95/100
558/558 - 3s - loss: 0.1845 - recall: 0.9327 - val_loss: 0.7491 - val_recall: 0.6094 - 3s/epoch - 5ms/step
Epoch 96/100
558/558 - 1s - loss: 0.1839 - recall: 0.9370 - val_loss: 0.7281 - val_recall: 0.5849 - 1s/epoch - 3ms/step
Epoch 97/100
558/558 - 1s - loss: 0.1756 - recall: 0.9394 - val_loss: 0.7378 - val_recall: 0.5787 - 1s/epoch - 3ms/step
Epoch 98/100
558/558 - 2s - loss: 0.1880 - recall: 0.9262 - val_loss: 0.7393 - val_recall: 0.5828 - 2s/epoch - 3ms/step
Epoch 99/100
558/558 - 1s - loss: 0.1882 - recall: 0.9273 - val_loss: 0.7327 - val_recall: 0.5767 - 1s/epoch - 3ms/step
Epoch 100/100
558/558 - 2s - loss: 0.1811 - recall: 0.9318 - val_loss: 0.7462 - val_recall: 0.5930 - 2s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                        2
# neurons - hidden layer               [128, 128, 64]
activation function - hidden layer     [relu, relu]
# epochs                               100
batch size                             16
optimizer                              RMS
learning rate, momentum, dropout       [1e-05, 0.9, 0]
weight initializer                     he_uniform
regularization                         -
train loss                             0.181092
validation loss                        0.746226
train recall                           0.931823
validation recall                      0.593047
time (secs)                            203.36
model                                  <keras.src.engine.sequential.Sequential object...
history                                <keras.src.callbacks.History object at 0x7dfce...
Name: 9, dtype: object
The trained neural network behaves very differently on the validation data: the validation loss follows its own trajectory instead of tracking the training loss.
train recall 0.931823 & validation recall 0.593047
The recall score on the training data is very high, while the validation score is much lower. This large gap indicates that the model has overfit the training data, so it is not a good model.
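Since the RMSprop run overfits even more sharply (training loss keeps falling while validation loss climbs), dropout is a natural regularizer to try. This is a hypothetical sketch of the same 128-128-64 architecture with Dropout layers inserted, not the notebook's `model_fit` helper; the dropout rate 0.3 and the 11-feature input shape (implied by the 1536 parameters of the first Dense layer) are assumptions:

```python
# Hypothetical sketch: same architecture as above, with Dropout added
# after each BatchNormalization layer as a regularizer.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, BatchNormalization

model = Sequential([
    Dense(128, activation='relu', kernel_initializer='he_uniform',
          input_shape=(11,)),              # 11 input features assumed
    Dense(128, activation='relu', kernel_initializer='he_uniform'),
    BatchNormalization(),
    Dropout(0.3),                          # randomly zero 30% of activations
    Dense(64, activation='relu', kernel_initializer='he_uniform'),
    BatchNormalization(),
    Dropout(0.3),
    Dense(1, activation='sigmoid'),        # churn probability
])
```

Dropout layers add no trainable parameters, so the parameter count stays at 27,137; they only force the network not to rely on any single co-adapted set of activations, which typically narrows the train/validation recall gap.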
i+=1
model_fit('relu','relu','Adam',X_train_over,y_train_over,100,i,learning_rte=1e-5)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization)  (None, 128)  512
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization)  (None, 64)  256
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
558/558 - 3s - loss: 0.9524 - recall: 0.5026 - val_loss: 0.8809 - val_recall: 0.5378 - 3s/epoch - 6ms/step
Epoch 2/100
558/558 - 2s - loss: 0.7859 - recall: 0.5755 - val_loss: 0.7722 - val_recall: 0.5808 - 2s/epoch - 4ms/step
Epoch 3/100
558/558 - 2s - loss: 0.6981 - recall: 0.6196 - val_loss: 0.7008 - val_recall: 0.6196 - 2s/epoch - 4ms/step
Epoch 4/100
558/558 - 1s - loss: 0.6531 - recall: 0.6513 - val_loss: 0.6746 - val_recall: 0.6319 - 1s/epoch - 3ms/step
Epoch 5/100
558/558 - 1s - loss: 0.6276 - recall: 0.6535 - val_loss: 0.6586 - val_recall: 0.6585 - 1s/epoch - 3ms/step
Epoch 6/100
558/558 - 1s - loss: 0.6060 - recall: 0.6708 - val_loss: 0.6309 - val_recall: 0.6544 - 1s/epoch - 3ms/step
Epoch 7/100
558/558 - 1s - loss: 0.5917 - recall: 0.6903 - val_loss: 0.6068 - val_recall: 0.6442 - 1s/epoch - 3ms/step
Epoch 8/100
558/558 - 1s - loss: 0.5788 - recall: 0.7040 - val_loss: 0.5989 - val_recall: 0.6462 - 1s/epoch - 3ms/step
Epoch 9/100
558/558 - 1s - loss: 0.5669 - recall: 0.7058 - val_loss: 0.5911 - val_recall: 0.6483 - 1s/epoch - 3ms/step
Epoch 10/100
558/558 - 2s - loss: 0.5576 - recall: 0.7248 - val_loss: 0.5877 - val_recall: 0.6626 - 2s/epoch - 4ms/step
Epoch 11/100
558/558 - 2s - loss: 0.5549 - recall: 0.7201 - val_loss: 0.5812 - val_recall: 0.6585 - 2s/epoch - 4ms/step
Epoch 12/100
558/558 - 2s - loss: 0.5503 - recall: 0.7271 - val_loss: 0.5667 - val_recall: 0.6585 - 2s/epoch - 3ms/step
Epoch 13/100
558/558 - 1s - loss: 0.5363 - recall: 0.7387 - val_loss: 0.5599 - val_recall: 0.6585 - 1s/epoch - 2ms/step
Epoch 14/100
558/558 - 1s - loss: 0.5304 - recall: 0.7479 - val_loss: 0.5710 - val_recall: 0.6748 - 1s/epoch - 3ms/step
Epoch 15/100
558/558 - 1s - loss: 0.5276 - recall: 0.7488 - val_loss: 0.5605 - val_recall: 0.6646 - 1s/epoch - 3ms/step
Epoch 16/100
558/558 - 2s - loss: 0.5227 - recall: 0.7499 - val_loss: 0.5650 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 17/100
558/558 - 2s - loss: 0.5137 - recall: 0.7526 - val_loss: 0.5604 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 18/100
558/558 - 1s - loss: 0.5184 - recall: 0.7488 - val_loss: 0.5532 - val_recall: 0.6810 - 1s/epoch - 3ms/step
Epoch 19/100
558/558 - 2s - loss: 0.5101 - recall: 0.7591 - val_loss: 0.5392 - val_recall: 0.6687 - 2s/epoch - 3ms/step
Epoch 20/100
558/558 - 2s - loss: 0.5039 - recall: 0.7641 - val_loss: 0.5388 - val_recall: 0.6769 - 2s/epoch - 4ms/step
Epoch 21/100
558/558 - 2s - loss: 0.5039 - recall: 0.7690 - val_loss: 0.5402 - val_recall: 0.6708 - 2s/epoch - 3ms/step
Epoch 22/100
558/558 - 1s - loss: 0.4991 - recall: 0.7656 - val_loss: 0.5407 - val_recall: 0.6830 - 1s/epoch - 3ms/step
Epoch 23/100
558/558 - 1s - loss: 0.4953 - recall: 0.7652 - val_loss: 0.5408 - val_recall: 0.6769 - 1s/epoch - 3ms/step
Epoch 24/100
558/558 - 2s - loss: 0.4933 - recall: 0.7699 - val_loss: 0.5263 - val_recall: 0.6687 - 2s/epoch - 3ms/step
Epoch 25/100
558/558 - 2s - loss: 0.4886 - recall: 0.7742 - val_loss: 0.5224 - val_recall: 0.6687 - 2s/epoch - 3ms/step
Epoch 26/100
558/558 - 2s - loss: 0.4859 - recall: 0.7753 - val_loss: 0.5178 - val_recall: 0.6667 - 2s/epoch - 3ms/step
Epoch 27/100
558/558 - 2s - loss: 0.4828 - recall: 0.7746 - val_loss: 0.5246 - val_recall: 0.6769 - 2s/epoch - 3ms/step
Epoch 28/100
558/558 - 2s - loss: 0.4814 - recall: 0.7719 - val_loss: 0.5248 - val_recall: 0.6769 - 2s/epoch - 4ms/step
Epoch 29/100
558/558 - 2s - loss: 0.4717 - recall: 0.7800 - val_loss: 0.5187 - val_recall: 0.6708 - 2s/epoch - 4ms/step
Epoch 30/100
558/558 - 2s - loss: 0.4781 - recall: 0.7867 - val_loss: 0.5132 - val_recall: 0.6708 - 2s/epoch - 3ms/step
Epoch 31/100
558/558 - 2s - loss: 0.4761 - recall: 0.7773 - val_loss: 0.5100 - val_recall: 0.6646 - 2s/epoch - 3ms/step
Epoch 32/100
558/558 - 1s - loss: 0.4649 - recall: 0.7847 - val_loss: 0.5141 - val_recall: 0.6626 - 1s/epoch - 2ms/step
Epoch 33/100
558/558 - 2s - loss: 0.4649 - recall: 0.7836 - val_loss: 0.5125 - val_recall: 0.6667 - 2s/epoch - 3ms/step
Epoch 34/100
558/558 - 1s - loss: 0.4622 - recall: 0.7959 - val_loss: 0.5035 - val_recall: 0.6626 - 1s/epoch - 3ms/step
Epoch 35/100
558/558 - 1s - loss: 0.4672 - recall: 0.7811 - val_loss: 0.5117 - val_recall: 0.6687 - 1s/epoch - 2ms/step
Epoch 36/100
558/558 - 1s - loss: 0.4641 - recall: 0.7885 - val_loss: 0.5173 - val_recall: 0.6810 - 1s/epoch - 3ms/step
Epoch 37/100
558/558 - 2s - loss: 0.4581 - recall: 0.7899 - val_loss: 0.5073 - val_recall: 0.6687 - 2s/epoch - 4ms/step
Epoch 38/100
558/558 - 2s - loss: 0.4554 - recall: 0.7917 - val_loss: 0.5076 - val_recall: 0.6769 - 2s/epoch - 4ms/step
Epoch 39/100
558/558 - 2s - loss: 0.4510 - recall: 0.7928 - val_loss: 0.5036 - val_recall: 0.6667 - 2s/epoch - 3ms/step
Epoch 40/100
558/558 - 1s - loss: 0.4522 - recall: 0.7984 - val_loss: 0.4962 - val_recall: 0.6687 - 1s/epoch - 2ms/step
Epoch 41/100
558/558 - 1s - loss: 0.4515 - recall: 0.7977 - val_loss: 0.4974 - val_recall: 0.6748 - 1s/epoch - 3ms/step
Epoch 42/100
558/558 - 1s - loss: 0.4444 - recall: 0.8000 - val_loss: 0.5068 - val_recall: 0.6769 - 1s/epoch - 3ms/step
Epoch 43/100
558/558 - 1s - loss: 0.4500 - recall: 0.7943 - val_loss: 0.5045 - val_recall: 0.6748 - 1s/epoch - 2ms/step
Epoch 44/100
558/558 - 1s - loss: 0.4417 - recall: 0.7997 - val_loss: 0.4944 - val_recall: 0.6708 - 1s/epoch - 3ms/step
Epoch 45/100
558/558 - 1s - loss: 0.4480 - recall: 0.8004 - val_loss: 0.4942 - val_recall: 0.6667 - 1s/epoch - 3ms/step
Epoch 46/100
558/558 - 2s - loss: 0.4387 - recall: 0.8040 - val_loss: 0.5032 - val_recall: 0.6748 - 2s/epoch - 3ms/step
Epoch 47/100
558/558 - 3s - loss: 0.4360 - recall: 0.8026 - val_loss: 0.4961 - val_recall: 0.6687 - 3s/epoch - 4ms/step
Epoch 48/100
558/558 - 2s - loss: 0.4441 - recall: 0.8029 - val_loss: 0.4969 - val_recall: 0.6708 - 2s/epoch - 3ms/step
Epoch 49/100
558/558 - 1s - loss: 0.4350 - recall: 0.8015 - val_loss: 0.4909 - val_recall: 0.6667 - 1s/epoch - 3ms/step
Epoch 50/100
558/558 - 1s - loss: 0.4340 - recall: 0.8080 - val_loss: 0.4907 - val_recall: 0.6687 - 1s/epoch - 2ms/step
Epoch 51/100
558/558 - 1s - loss: 0.4308 - recall: 0.8103 - val_loss: 0.4947 - val_recall: 0.6748 - 1s/epoch - 3ms/step
Epoch 52/100
558/558 - 1s - loss: 0.4312 - recall: 0.8026 - val_loss: 0.4948 - val_recall: 0.6667 - 1s/epoch - 3ms/step
Epoch 53/100
558/558 - 2s - loss: 0.4331 - recall: 0.8065 - val_loss: 0.4911 - val_recall: 0.6626 - 2s/epoch - 3ms/step
Epoch 54/100
558/558 - 1s - loss: 0.4262 - recall: 0.8105 - val_loss: 0.4847 - val_recall: 0.6544 - 1s/epoch - 3ms/step
Epoch 55/100
558/558 - 2s - loss: 0.4252 - recall: 0.8083 - val_loss: 0.4876 - val_recall: 0.6626 - 2s/epoch - 3ms/step
Epoch 56/100
558/558 - 2s - loss: 0.4221 - recall: 0.8195 - val_loss: 0.4898 - val_recall: 0.6626 - 2s/epoch - 4ms/step
Epoch 57/100
558/558 - 2s - loss: 0.4225 - recall: 0.8121 - val_loss: 0.4854 - val_recall: 0.6544 - 2s/epoch - 3ms/step
Epoch 58/100
558/558 - 1s - loss: 0.4244 - recall: 0.8127 - val_loss: 0.4784 - val_recall: 0.6524 - 1s/epoch - 2ms/step
Epoch 59/100
558/558 - 1s - loss: 0.4201 - recall: 0.8157 - val_loss: 0.4874 - val_recall: 0.6564 - 1s/epoch - 3ms/step
Epoch 60/100
558/558 - 1s - loss: 0.4183 - recall: 0.8103 - val_loss: 0.4837 - val_recall: 0.6564 - 1s/epoch - 2ms/step
Epoch 61/100
558/558 - 1s - loss: 0.4146 - recall: 0.8107 - val_loss: 0.4831 - val_recall: 0.6564 - 1s/epoch - 3ms/step
Epoch 62/100
558/558 - 1s - loss: 0.4183 - recall: 0.8100 - val_loss: 0.4889 - val_recall: 0.6626 - 1s/epoch - 2ms/step
Epoch 63/100
558/558 - 1s - loss: 0.4147 - recall: 0.8121 - val_loss: 0.4933 - val_recall: 0.6728 - 1s/epoch - 3ms/step
Epoch 64/100
558/558 - 2s - loss: 0.4144 - recall: 0.8168 - val_loss: 0.4919 - val_recall: 0.6646 - 2s/epoch - 3ms/step
Epoch 65/100
558/558 - 2s - loss: 0.4142 - recall: 0.8222 - val_loss: 0.4882 - val_recall: 0.6605 - 2s/epoch - 4ms/step
Epoch 66/100
558/558 - 2s - loss: 0.4139 - recall: 0.8143 - val_loss: 0.4803 - val_recall: 0.6483 - 2s/epoch - 4ms/step
Epoch 67/100
558/558 - 1s - loss: 0.4086 - recall: 0.8208 - val_loss: 0.4842 - val_recall: 0.6503 - 1s/epoch - 2ms/step
Epoch 68/100
558/558 - 1s - loss: 0.4097 - recall: 0.8186 - val_loss: 0.4726 - val_recall: 0.6360 - 1s/epoch - 3ms/step
Epoch 69/100
558/558 - 1s - loss: 0.4084 - recall: 0.8154 - val_loss: 0.4819 - val_recall: 0.6421 - 1s/epoch - 3ms/step
Epoch 70/100
558/558 - 1s - loss: 0.4062 - recall: 0.8217 - val_loss: 0.4846 - val_recall: 0.6503 - 1s/epoch - 3ms/step
Epoch 71/100
558/558 - 1s - loss: 0.4054 - recall: 0.8213 - val_loss: 0.4925 - val_recall: 0.6626 - 1s/epoch - 2ms/step
Epoch 72/100
558/558 - 1s - loss: 0.4032 - recall: 0.8208 - val_loss: 0.4760 - val_recall: 0.6483 - 1s/epoch - 2ms/step
Epoch 73/100
558/558 - 1s - loss: 0.4029 - recall: 0.8210 - val_loss: 0.4831 - val_recall: 0.6585 - 1s/epoch - 3ms/step
Epoch 74/100
558/558 - 2s - loss: 0.3995 - recall: 0.8260 - val_loss: 0.4730 - val_recall: 0.6380 - 2s/epoch - 4ms/step
Epoch 75/100
558/558 - 2s - loss: 0.4097 - recall: 0.8130 - val_loss: 0.4729 - val_recall: 0.6360 - 2s/epoch - 4ms/step
Epoch 76/100
558/558 - 1s - loss: 0.4007 - recall: 0.8217 - val_loss: 0.4867 - val_recall: 0.6605 - 1s/epoch - 3ms/step
Epoch 77/100
558/558 - 1s - loss: 0.3957 - recall: 0.8356 - val_loss: 0.4748 - val_recall: 0.6524 - 1s/epoch - 3ms/step
Epoch 78/100
558/558 - 2s - loss: 0.3981 - recall: 0.8269 - val_loss: 0.4855 - val_recall: 0.6564 - 2s/epoch - 3ms/step
Epoch 79/100
558/558 - 2s - loss: 0.3977 - recall: 0.8253 - val_loss: 0.4804 - val_recall: 0.6524 - 2s/epoch - 3ms/step
Epoch 80/100
558/558 - 1s - loss: 0.4021 - recall: 0.8237 - val_loss: 0.4834 - val_recall: 0.6524 - 1s/epoch - 2ms/step
Epoch 81/100
558/558 - 2s - loss: 0.4033 - recall: 0.8242 - val_loss: 0.4713 - val_recall: 0.6380 - 2s/epoch - 3ms/step
Epoch 82/100
558/558 - 2s - loss: 0.3983 - recall: 0.8257 - val_loss: 0.4869 - val_recall: 0.6605 - 2s/epoch - 3ms/step
Epoch 83/100
558/558 - 2s - loss: 0.3961 - recall: 0.8240 - val_loss: 0.4743 - val_recall: 0.6380 - 2s/epoch - 4ms/step
Epoch 84/100
558/558 - 2s - loss: 0.3950 - recall: 0.8255 - val_loss: 0.4761 - val_recall: 0.6442 - 2s/epoch - 4ms/step
Epoch 85/100
558/558 - 2s - loss: 0.3908 - recall: 0.8273 - val_loss: 0.4744 - val_recall: 0.6401 - 2s/epoch - 3ms/step
Epoch 86/100
558/558 - 1s - loss: 0.3865 - recall: 0.8311 - val_loss: 0.4784 - val_recall: 0.6503 - 1s/epoch - 3ms/step
Epoch 87/100
558/558 - 1s - loss: 0.3881 - recall: 0.8287 - val_loss: 0.4771 - val_recall: 0.6524 - 1s/epoch - 3ms/step
Epoch 88/100
558/558 - 1s - loss: 0.3914 - recall: 0.8291 - val_loss: 0.4843 - val_recall: 0.6564 - 1s/epoch - 2ms/step
Epoch 89/100
558/558 - 1s - loss: 0.3873 - recall: 0.8296 - val_loss: 0.4751 - val_recall: 0.6421 - 1s/epoch - 3ms/step
Epoch 90/100
558/558 - 1s - loss: 0.3880 - recall: 0.8291 - val_loss: 0.4864 - val_recall: 0.6605 - 1s/epoch - 2ms/step
Epoch 91/100
558/558 - 1s - loss: 0.3833 - recall: 0.8311 - val_loss: 0.4833 - val_recall: 0.6462 - 1s/epoch - 2ms/step
Epoch 92/100
558/558 - 2s - loss: 0.3843 - recall: 0.8322 - val_loss: 0.4793 - val_recall: 0.6421 - 2s/epoch - 3ms/step
Epoch 93/100
558/558 - 2s - loss: 0.3874 - recall: 0.8287 - val_loss: 0.4754 - val_recall: 0.6360 - 2s/epoch - 4ms/step
Epoch 94/100
558/558 - 2s - loss: 0.3784 - recall: 0.8322 - val_loss: 0.4925 - val_recall: 0.6524 - 2s/epoch - 4ms/step
Epoch 95/100
558/558 - 1s - loss: 0.3814 - recall: 0.8325 - val_loss: 0.4804 - val_recall: 0.6564 - 1s/epoch - 2ms/step
Epoch 96/100
558/558 - 1s - loss: 0.3798 - recall: 0.8340 - val_loss: 0.4805 - val_recall: 0.6401 - 1s/epoch - 3ms/step
Epoch 97/100
558/558 - 1s - loss: 0.3801 - recall: 0.8390 - val_loss: 0.4847 - val_recall: 0.6462 - 1s/epoch - 3ms/step
Epoch 98/100
558/558 - 1s - loss: 0.3806 - recall: 0.8363 - val_loss: 0.4770 - val_recall: 0.6442 - 1s/epoch - 2ms/step
Epoch 99/100
558/558 - 1s - loss: 0.3785 - recall: 0.8354 - val_loss: 0.4817 - val_recall: 0.6503 - 1s/epoch - 2ms/step
Epoch 100/100
558/558 - 2s - loss: 0.3774 - recall: 0.8374 - val_loss: 0.4771 - val_recall: 0.6360 - 2s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                        2
# neurons - hidden layer               [128, 128, 64]
activation function - hidden layer     [relu, relu]
# epochs                               100
batch size                             16
optimizer                              Adam
learning rate, momentum, dropout       [1e-05, 0.0, 0]
weight initializer                     he_uniform
regularization                         -
train loss                             0.377406
validation loss                        0.477113
train recall                           0.837407
validation recall                      0.635992
time (secs)                            165.31
model                                  <keras.src.engine.sequential.Sequential object...
history                                <keras.src.callbacks.History object at 0x7dfcd...
Name: 10, dtype: object
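In the log above, the validation loss flattens out well before epoch 100 while the training loss keeps falling. A patience rule such as Keras' `EarlyStopping(monitor='val_loss', patience=..., restore_best_weights=True)` callback would halt training near that point instead of running all 100 epochs. A plain-Python sketch of the patience logic (not the actual callback):

```python
def best_epoch(val_losses, patience=10):
    """Return the index of the epoch early stopping would restore to:
    the last improvement before `patience` consecutive non-improving epochs."""
    best, best_i, waited = float("inf"), 0, 0
    for i, loss in enumerate(val_losses):
        if loss < best:
            best, best_i, waited = loss, i, 0
        else:
            waited += 1
            if waited >= patience:
                break
    return best_i
```

Applied to the `val_loss` column of the history above, this would pick an epoch in the 50s-60s rather than epoch 100, saving roughly a third of the training time.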
The neural network trained with the Adam optimizer shows good convergence of the loss, and the loss curves behave similarly for the training and validation data.
train recall: 0.837407 & validation recall: 0.635992
The recall on the training data is good but noticeably lower on the validation data. This gap suggests overfitting, so this is not the best model.
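The next run attacks this train/validation gap with dropout (`dropoutval=0.2`), which randomly zeroes a fraction of hidden units on each training step. As a reminder of the mechanism, here is a minimal NumPy sketch of inverted dropout, the variant Keras' `Dropout` layer uses (survivors are scaled at train time so no rescaling is needed at inference):

```python
import numpy as np

def inverted_dropout(activations, rate, rng):
    """Inverted dropout: zero a `rate` fraction of units at train time and
    scale survivors by 1/(1-rate) so the expected activation is unchanged."""
    keep_prob = 1.0 - rate
    mask = rng.random(activations.shape) < keep_prob  # keep each unit w.p. keep_prob
    return activations * mask / keep_prob

rng = np.random.default_rng(0)
hidden = np.ones((4, 128))                 # stand-in for a hidden-layer output
dropped = inverted_dropout(hidden, 0.2, rng)
```

Because each unit can vanish on any step, the network cannot rely on brittle co-adaptations of specific neurons, which is why dropout typically narrows the gap between training and validation metrics.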
i+=1
model_fit_with_dropout('relu','relu','Adam',X_train_over,y_train_over,100,i,dropoutval=0.2,learning_rte=1e-5)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dropout (Dropout) (None, 128) 0
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization)  (None, 128)  512
dropout_1 (Dropout) (None, 128) 0
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization)  (None, 64)  256
dropout_2 (Dropout) (None, 64) 0
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
558/558 - 5s - loss: 0.9871 - recall: 0.4950 - val_loss: 0.7654 - val_recall: 0.4254 - 5s/epoch - 9ms/step
Epoch 2/100
558/558 - 1s - loss: 0.9071 - recall: 0.5192 - val_loss: 0.7007 - val_recall: 0.4642 - 1s/epoch - 3ms/step
Epoch 3/100
558/558 - 1s - loss: 0.8493 - recall: 0.5474 - val_loss: 0.6576 - val_recall: 0.5051 - 1s/epoch - 3ms/step
Epoch 4/100
558/558 - 1s - loss: 0.8206 - recall: 0.5656 - val_loss: 0.6357 - val_recall: 0.5378 - 1s/epoch - 3ms/step
Epoch 5/100
558/558 - 1s - loss: 0.7898 - recall: 0.5793 - val_loss: 0.6257 - val_recall: 0.5869 - 1s/epoch - 3ms/step
Epoch 6/100
558/558 - 1s - loss: 0.7581 - recall: 0.6073 - val_loss: 0.6045 - val_recall: 0.5890 - 1s/epoch - 3ms/step
Epoch 7/100
558/558 - 2s - loss: 0.7502 - recall: 0.6022 - val_loss: 0.5836 - val_recall: 0.5890 - 2s/epoch - 3ms/step
Epoch 8/100
558/558 - 2s - loss: 0.7349 - recall: 0.6136 - val_loss: 0.5763 - val_recall: 0.6012 - 2s/epoch - 3ms/step
Epoch 9/100
558/558 - 2s - loss: 0.7181 - recall: 0.6248 - val_loss: 0.5726 - val_recall: 0.6135 - 2s/epoch - 4ms/step
Epoch 10/100
558/558 - 2s - loss: 0.7087 - recall: 0.6286 - val_loss: 0.5660 - val_recall: 0.6176 - 2s/epoch - 3ms/step
Epoch 11/100
558/558 - 2s - loss: 0.7008 - recall: 0.6427 - val_loss: 0.5642 - val_recall: 0.6299 - 2s/epoch - 3ms/step
Epoch 12/100
558/558 - 2s - loss: 0.6888 - recall: 0.6457 - val_loss: 0.5580 - val_recall: 0.6278 - 2s/epoch - 3ms/step
Epoch 13/100
558/558 - 2s - loss: 0.6776 - recall: 0.6396 - val_loss: 0.5561 - val_recall: 0.6401 - 2s/epoch - 3ms/step
Epoch 14/100
558/558 - 2s - loss: 0.6788 - recall: 0.6407 - val_loss: 0.5707 - val_recall: 0.6626 - 2s/epoch - 3ms/step
Epoch 15/100
558/558 - 2s - loss: 0.6710 - recall: 0.6479 - val_loss: 0.5557 - val_recall: 0.6503 - 2s/epoch - 3ms/step
Epoch 16/100
558/558 - 2s - loss: 0.6739 - recall: 0.6463 - val_loss: 0.5607 - val_recall: 0.6585 - 2s/epoch - 3ms/step
Epoch 17/100
558/558 - 2s - loss: 0.6573 - recall: 0.6611 - val_loss: 0.5590 - val_recall: 0.6585 - 2s/epoch - 4ms/step
Epoch 18/100
558/558 - 2s - loss: 0.6635 - recall: 0.6609 - val_loss: 0.5589 - val_recall: 0.6769 - 2s/epoch - 4ms/step
Epoch 19/100
558/558 - 2s - loss: 0.6613 - recall: 0.6634 - val_loss: 0.5499 - val_recall: 0.6687 - 2s/epoch - 3ms/step
Epoch 20/100
558/558 - 1s - loss: 0.6477 - recall: 0.6739 - val_loss: 0.5443 - val_recall: 0.6585 - 1s/epoch - 3ms/step
Epoch 21/100
558/558 - 2s - loss: 0.6478 - recall: 0.6616 - val_loss: 0.5531 - val_recall: 0.6789 - 2s/epoch - 3ms/step
Epoch 22/100
558/558 - 2s - loss: 0.6499 - recall: 0.6625 - val_loss: 0.5488 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 23/100
558/558 - 2s - loss: 0.6479 - recall: 0.6694 - val_loss: 0.5536 - val_recall: 0.6871 - 2s/epoch - 3ms/step
Epoch 24/100
558/558 - 1s - loss: 0.6383 - recall: 0.6824 - val_loss: 0.5401 - val_recall: 0.6708 - 1s/epoch - 3ms/step
Epoch 25/100
558/558 - 2s - loss: 0.6443 - recall: 0.6739 - val_loss: 0.5377 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 26/100
558/558 - 2s - loss: 0.6395 - recall: 0.6715 - val_loss: 0.5367 - val_recall: 0.6769 - 2s/epoch - 4ms/step
Epoch 27/100
558/558 - 3s - loss: 0.6414 - recall: 0.6663 - val_loss: 0.5402 - val_recall: 0.6830 - 3s/epoch - 5ms/step
Epoch 28/100
558/558 - 2s - loss: 0.6370 - recall: 0.6764 - val_loss: 0.5382 - val_recall: 0.6871 - 2s/epoch - 3ms/step
Epoch 29/100
558/558 - 1s - loss: 0.6291 - recall: 0.6737 - val_loss: 0.5344 - val_recall: 0.6830 - 1s/epoch - 3ms/step
Epoch 30/100
558/558 - 1s - loss: 0.6316 - recall: 0.6746 - val_loss: 0.5312 - val_recall: 0.6810 - 1s/epoch - 3ms/step
Epoch 31/100
558/558 - 2s - loss: 0.6262 - recall: 0.6766 - val_loss: 0.5328 - val_recall: 0.6851 - 2s/epoch - 3ms/step
Epoch 32/100
558/558 - 1s - loss: 0.6200 - recall: 0.6889 - val_loss: 0.5382 - val_recall: 0.6973 - 1s/epoch - 3ms/step
Epoch 33/100
558/558 - 2s - loss: 0.6220 - recall: 0.6732 - val_loss: 0.5313 - val_recall: 0.6851 - 2s/epoch - 3ms/step
Epoch 34/100
558/558 - 2s - loss: 0.6257 - recall: 0.6800 - val_loss: 0.5289 - val_recall: 0.6851 - 2s/epoch - 3ms/step
Epoch 35/100
558/558 - 2s - loss: 0.6255 - recall: 0.6791 - val_loss: 0.5361 - val_recall: 0.6892 - 2s/epoch - 4ms/step
Epoch 36/100
558/558 - 2s - loss: 0.6265 - recall: 0.6748 - val_loss: 0.5348 - val_recall: 0.7014 - 2s/epoch - 3ms/step
Epoch 37/100
558/558 - 2s - loss: 0.6173 - recall: 0.6842 - val_loss: 0.5268 - val_recall: 0.6892 - 2s/epoch - 3ms/step
Epoch 38/100
558/558 - 1s - loss: 0.6209 - recall: 0.6809 - val_loss: 0.5344 - val_recall: 0.6892 - 1s/epoch - 3ms/step
Epoch 39/100
558/558 - 1s - loss: 0.6131 - recall: 0.6829 - val_loss: 0.5251 - val_recall: 0.6871 - 1s/epoch - 3ms/step
Epoch 40/100
558/558 - 2s - loss: 0.6177 - recall: 0.6901 - val_loss: 0.5255 - val_recall: 0.6933 - 2s/epoch - 3ms/step
Epoch 41/100
558/558 - 2s - loss: 0.6141 - recall: 0.6907 - val_loss: 0.5161 - val_recall: 0.6830 - 2s/epoch - 3ms/step
Epoch 42/100
558/558 - 2s - loss: 0.6126 - recall: 0.6836 - val_loss: 0.5290 - val_recall: 0.6973 - 2s/epoch - 3ms/step
Epoch 43/100
558/558 - 2s - loss: 0.6075 - recall: 0.6912 - val_loss: 0.5280 - val_recall: 0.6953 - 2s/epoch - 4ms/step
Epoch 44/100
558/558 - 2s - loss: 0.6078 - recall: 0.6845 - val_loss: 0.5191 - val_recall: 0.6851 - 2s/epoch - 4ms/step
Epoch 45/100
558/558 - 2s - loss: 0.6096 - recall: 0.6919 - val_loss: 0.5148 - val_recall: 0.6871 - 2s/epoch - 3ms/step
Epoch 46/100
558/558 - 2s - loss: 0.6076 - recall: 0.6851 - val_loss: 0.5239 - val_recall: 0.6953 - 2s/epoch - 3ms/step
Epoch 47/100
558/558 - 2s - loss: 0.6078 - recall: 0.6892 - val_loss: 0.5191 - val_recall: 0.6953 - 2s/epoch - 3ms/step
Epoch 48/100
558/558 - 2s - loss: 0.6109 - recall: 0.6907 - val_loss: 0.5183 - val_recall: 0.6973 - 2s/epoch - 3ms/step
Epoch 49/100
558/558 - 1s - loss: 0.6041 - recall: 0.6912 - val_loss: 0.5153 - val_recall: 0.6912 - 1s/epoch - 3ms/step
Epoch 50/100
558/558 - 2s - loss: 0.6024 - recall: 0.6903 - val_loss: 0.5123 - val_recall: 0.6933 - 2s/epoch - 3ms/step
Epoch 51/100
558/558 - 2s - loss: 0.6025 - recall: 0.6898 - val_loss: 0.5247 - val_recall: 0.7117 - 2s/epoch - 3ms/step
Epoch 52/100
558/558 - 3s - loss: 0.6009 - recall: 0.6871 - val_loss: 0.5208 - val_recall: 0.7014 - 3s/epoch - 5ms/step
Epoch 53/100
558/558 - 2s - loss: 0.5987 - recall: 0.6981 - val_loss: 0.5157 - val_recall: 0.6953 - 2s/epoch - 3ms/step
Epoch 54/100
558/558 - 2s - loss: 0.6014 - recall: 0.6892 - val_loss: 0.5085 - val_recall: 0.6994 - 2s/epoch - 3ms/step
Epoch 55/100
558/558 - 1s - loss: 0.5931 - recall: 0.6943 - val_loss: 0.5087 - val_recall: 0.6933 - 1s/epoch - 3ms/step
Epoch 56/100
558/558 - 2s - loss: 0.5954 - recall: 0.6905 - val_loss: 0.5226 - val_recall: 0.7157 - 2s/epoch - 3ms/step
Epoch 57/100
558/558 - 2s - loss: 0.5918 - recall: 0.6941 - val_loss: 0.5104 - val_recall: 0.6953 - 2s/epoch - 3ms/step
Epoch 58/100
558/558 - 2s - loss: 0.5939 - recall: 0.7011 - val_loss: 0.5073 - val_recall: 0.6994 - 2s/epoch - 3ms/step
Epoch 59/100
558/558 - 2s - loss: 0.5948 - recall: 0.6923 - val_loss: 0.5119 - val_recall: 0.7076 - 2s/epoch - 4ms/step
Epoch 60/100
558/558 - 2s - loss: 0.5814 - recall: 0.7098 - val_loss: 0.5118 - val_recall: 0.7055 - 2s/epoch - 4ms/step
Epoch 61/100
558/558 - 2s - loss: 0.5893 - recall: 0.6959 - val_loss: 0.5097 - val_recall: 0.7117 - 2s/epoch - 4ms/step
Epoch 62/100
558/558 - 2s - loss: 0.5878 - recall: 0.6954 - val_loss: 0.5142 - val_recall: 0.7137 - 2s/epoch - 3ms/step
Epoch 63/100
558/558 - 1s - loss: 0.5873 - recall: 0.6950 - val_loss: 0.5134 - val_recall: 0.7137 - 1s/epoch - 3ms/step
Epoch 64/100
558/558 - 2s - loss: 0.5836 - recall: 0.7060 - val_loss: 0.5168 - val_recall: 0.7260 - 2s/epoch - 3ms/step
Epoch 65/100
558/558 - 2s - loss: 0.5928 - recall: 0.6957 - val_loss: 0.5160 - val_recall: 0.7198 - 2s/epoch - 3ms/step
Epoch 66/100
558/558 - 1s - loss: 0.5846 - recall: 0.7042 - val_loss: 0.5062 - val_recall: 0.7076 - 1s/epoch - 3ms/step
Epoch 67/100
558/558 - 1s - loss: 0.5817 - recall: 0.7040 - val_loss: 0.5118 - val_recall: 0.7219 - 1s/epoch - 3ms/step
Epoch 68/100
558/558 - 2s - loss: 0.5859 - recall: 0.7015 - val_loss: 0.4957 - val_recall: 0.7014 - 2s/epoch - 4ms/step
Epoch 69/100
558/558 - 3s - loss: 0.5784 - recall: 0.7064 - val_loss: 0.5029 - val_recall: 0.7055 - 3s/epoch - 5ms/step
Epoch 70/100
558/558 - 2s - loss: 0.5837 - recall: 0.6999 - val_loss: 0.5100 - val_recall: 0.7178 - 2s/epoch - 3ms/step
Epoch 71/100
558/558 - 1s - loss: 0.5765 - recall: 0.7062 - val_loss: 0.5140 - val_recall: 0.7219 - 1s/epoch - 3ms/step
Epoch 72/100
558/558 - 1s - loss: 0.5758 - recall: 0.7134 - val_loss: 0.5033 - val_recall: 0.7157 - 1s/epoch - 3ms/step
Epoch 73/100
558/558 - 2s - loss: 0.5813 - recall: 0.7053 - val_loss: 0.5050 - val_recall: 0.7178 - 2s/epoch - 3ms/step
Epoch 74/100
558/558 - 1s - loss: 0.5707 - recall: 0.7123 - val_loss: 0.5022 - val_recall: 0.7117 - 1s/epoch - 3ms/step
Epoch 75/100
558/558 - 2s - loss: 0.5738 - recall: 0.7073 - val_loss: 0.5004 - val_recall: 0.7096 - 2s/epoch - 3ms/step
Epoch 76/100
558/558 - 2s - loss: 0.5789 - recall: 0.7028 - val_loss: 0.5055 - val_recall: 0.7198 - 2s/epoch - 3ms/step
Epoch 77/100
558/558 - 2s - loss: 0.5635 - recall: 0.7230 - val_loss: 0.5021 - val_recall: 0.7137 - 2s/epoch - 4ms/step
Epoch 78/100
558/558 - 2s - loss: 0.5737 - recall: 0.7087 - val_loss: 0.5076 - val_recall: 0.7178 - 2s/epoch - 4ms/step
Epoch 79/100
558/558 - 2s - loss: 0.5748 - recall: 0.7145 - val_loss: 0.5034 - val_recall: 0.7178 - 2s/epoch - 3ms/step
Epoch 80/100
558/558 - 1s - loss: 0.5684 - recall: 0.7076 - val_loss: 0.5095 - val_recall: 0.7280 - 1s/epoch - 3ms/step
Epoch 81/100
558/558 - 1s - loss: 0.5706 - recall: 0.7098 - val_loss: 0.4990 - val_recall: 0.7117 - 1s/epoch - 3ms/step
Epoch 82/100
558/558 - 1s - loss: 0.5651 - recall: 0.7123 - val_loss: 0.5124 - val_recall: 0.7301 - 1s/epoch - 3ms/step
Epoch 83/100
558/558 - 2s - loss: 0.5718 - recall: 0.7040 - val_loss: 0.5022 - val_recall: 0.7178 - 2s/epoch - 3ms/step
Epoch 84/100
558/558 - 1s - loss: 0.5685 - recall: 0.7067 - val_loss: 0.4988 - val_recall: 0.7137 - 1s/epoch - 3ms/step
Epoch 85/100
558/558 - 1s - loss: 0.5684 - recall: 0.7170 - val_loss: 0.4961 - val_recall: 0.7035 - 1s/epoch - 3ms/step
Epoch 86/100
558/558 - 2s - loss: 0.5641 - recall: 0.7219 - val_loss: 0.4981 - val_recall: 0.7096 - 2s/epoch - 4ms/step
Epoch 87/100
558/558 - 2s - loss: 0.5637 - recall: 0.7080 - val_loss: 0.5012 - val_recall: 0.7157 - 2s/epoch - 4ms/step
Epoch 88/100
558/558 - 2s - loss: 0.5685 - recall: 0.7172 - val_loss: 0.5051 - val_recall: 0.7280 - 2s/epoch - 3ms/step
Epoch 89/100
558/558 - 2s - loss: 0.5665 - recall: 0.7185 - val_loss: 0.4955 - val_recall: 0.7117 - 2s/epoch - 3ms/step
Epoch 90/100
558/558 - 2s - loss: 0.5680 - recall: 0.7080 - val_loss: 0.5025 - val_recall: 0.7198 - 2s/epoch - 3ms/step
Epoch 91/100
558/558 - 2s - loss: 0.5669 - recall: 0.7163 - val_loss: 0.5004 - val_recall: 0.7178 - 2s/epoch - 3ms/step
Epoch 92/100
558/558 - 2s - loss: 0.5592 - recall: 0.7237 - val_loss: 0.4971 - val_recall: 0.7117 - 2s/epoch - 3ms/step
Epoch 93/100
558/558 - 2s - loss: 0.5544 - recall: 0.7156 - val_loss: 0.4914 - val_recall: 0.7076 - 2s/epoch - 3ms/step
Epoch 94/100
558/558 - 2s - loss: 0.5527 - recall: 0.7221 - val_loss: 0.5068 - val_recall: 0.7219 - 2s/epoch - 3ms/step
Epoch 95/100
558/558 - 2s - loss: 0.5624 - recall: 0.7046 - val_loss: 0.4920 - val_recall: 0.7096 - 2s/epoch - 4ms/step
Epoch 96/100
558/558 - 2s - loss: 0.5540 - recall: 0.7221 - val_loss: 0.5003 - val_recall: 0.7178 - 2s/epoch - 4ms/step
Epoch 97/100
558/558 - 2s - loss: 0.5559 - recall: 0.7210 - val_loss: 0.4954 - val_recall: 0.7157 - 2s/epoch - 3ms/step
Epoch 98/100
558/558 - 2s - loss: 0.5666 - recall: 0.7129 - val_loss: 0.4982 - val_recall: 0.7178 - 2s/epoch - 3ms/step
Epoch 99/100
558/558 - 2s - loss: 0.5517 - recall: 0.7206 - val_loss: 0.4966 - val_recall: 0.7137 - 2s/epoch - 3ms/step
Epoch 100/100
558/558 - 2s - loss: 0.5481 - recall: 0.7206 - val_loss: 0.4895 - val_recall: 0.7076 - 2s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                        2
# neurons - hidden layer               [128, 128, 64]
activation function - hidden layer     [relu, relu]
# epochs                               100
batch size                             16
optimizer                              Adam
learning rate, momentum, dropout       [1e-05, 0.0, 0.2]
weight initializer                     he_uniform
regularization                         -
train loss                             0.548106
validation loss                        0.489455
train recall                           0.720565
validation recall                      0.707566
time (secs)                            174.82
model                                  <keras.src.engine.sequential.Sequential object...
history                                <keras.src.callbacks.History object at 0x7dfcd...
Name: 11, dtype: object
The neural network built using Adam with 20% dropout is the best fit so far: the loss is low, and the recall scores for the training and validation data are close to one another.
train recall: 0.720565 & validation recall: 0.707566
This model has the best balance of recall scores and is the best of all the models tried.
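Since recall is the selection criterion, the chosen model should finally be confirmed on held-out test data. A minimal sketch of that check, computing recall directly from the confusion-matrix counts (the `best_model`, `X_test`, and `y_test` names are placeholders for the selected model and the held-out split):

```python
import numpy as np

def recall_from_predictions(y_true, y_pred):
    """Recall = TP / (TP + FN): the share of actual churners the model flags."""
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return tp / (tp + fn)

# Hypothetical final check (threshold 0.5 on the sigmoid output):
# y_pred = (best_model.predict(X_test) > 0.5).astype(int).ravel()
# print(recall_from_predictions(y_test, y_pred))
```

This matches what `sklearn.metrics.recall_score` (already imported above) computes for the positive class, so either can be used for the final report.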
i+=1
model_fit_with_dropout('relu','relu','Adam',X_train_over,y_train_over,100,i,dropoutval=0.3,learning_rte=1e-5)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dropout (Dropout) (None, 128) 0
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization)  (None, 128)  512
dropout_1 (Dropout) (None, 128) 0
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization)  (None, 64)  256
dropout_2 (Dropout) (None, 64) 0
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
558/558 - 4s - loss: 1.0154 - recall: 0.4889 - val_loss: 0.7185 - val_recall: 0.3681 - 4s/epoch - 7ms/step
Epoch 2/100
558/558 - 2s - loss: 0.9628 - recall: 0.5093 - val_loss: 0.6742 - val_recall: 0.4070 - 2s/epoch - 4ms/step
Epoch 3/100
558/558 - 2s - loss: 0.9131 - recall: 0.5232 - val_loss: 0.6466 - val_recall: 0.4254 - 2s/epoch - 4ms/step
Epoch 4/100
558/558 - 2s - loss: 0.8791 - recall: 0.5474 - val_loss: 0.6322 - val_recall: 0.4806 - 2s/epoch - 3ms/step
Epoch 5/100
558/558 - 2s - loss: 0.8553 - recall: 0.5503 - val_loss: 0.6147 - val_recall: 0.5092 - 2s/epoch - 3ms/step
Epoch 6/100
558/558 - 2s - loss: 0.8248 - recall: 0.5687 - val_loss: 0.5966 - val_recall: 0.5215 - 2s/epoch - 3ms/step
Epoch 7/100
558/558 - 1s - loss: 0.8138 - recall: 0.5831 - val_loss: 0.5821 - val_recall: 0.5317 - 1s/epoch - 3ms/step
Epoch 8/100
558/558 - 2s - loss: 0.7850 - recall: 0.5909 - val_loss: 0.5796 - val_recall: 0.5603 - 2s/epoch - 3ms/step
Epoch 9/100
558/558 - 2s - loss: 0.7866 - recall: 0.5858 - val_loss: 0.5705 - val_recall: 0.5685 - 2s/epoch - 3ms/step
Epoch 10/100
558/558 - 2s - loss: 0.7834 - recall: 0.5945 - val_loss: 0.5686 - val_recall: 0.5787 - 2s/epoch - 4ms/step
Epoch 11/100
558/558 - 2s - loss: 0.7659 - recall: 0.6019 - val_loss: 0.5675 - val_recall: 0.5910 - 2s/epoch - 4ms/step
Epoch 12/100
558/558 - 2s - loss: 0.7462 - recall: 0.6165 - val_loss: 0.5584 - val_recall: 0.5890 - 2s/epoch - 3ms/step
Epoch 13/100
558/558 - 2s - loss: 0.7384 - recall: 0.6120 - val_loss: 0.5545 - val_recall: 0.5910 - 2s/epoch - 3ms/step
Epoch 14/100
558/558 - 2s - loss: 0.7409 - recall: 0.6152 - val_loss: 0.5667 - val_recall: 0.6176 - 2s/epoch - 3ms/step
Epoch 15/100
558/558 - 2s - loss: 0.7294 - recall: 0.6297 - val_loss: 0.5607 - val_recall: 0.6155 - 2s/epoch - 3ms/step
Epoch 16/100
558/558 - 1s - loss: 0.7287 - recall: 0.6212 - val_loss: 0.5613 - val_recall: 0.6237 - 1s/epoch - 3ms/step
Epoch 17/100
558/558 - 1s - loss: 0.7055 - recall: 0.6304 - val_loss: 0.5589 - val_recall: 0.6339 - 1s/epoch - 3ms/step
Epoch 18/100
558/558 - 2s - loss: 0.7170 - recall: 0.6315 - val_loss: 0.5569 - val_recall: 0.6360 - 2s/epoch - 3ms/step
Epoch 19/100
558/558 - 2s - loss: 0.7148 - recall: 0.6266 - val_loss: 0.5571 - val_recall: 0.6421 - 2s/epoch - 4ms/step
Epoch 20/100
558/558 - 3s - loss: 0.6988 - recall: 0.6441 - val_loss: 0.5426 - val_recall: 0.6360 - 3s/epoch - 5ms/step
Epoch 21/100
558/558 - 2s - loss: 0.7017 - recall: 0.6246 - val_loss: 0.5531 - val_recall: 0.6442 - 2s/epoch - 3ms/step
Epoch 22/100
558/558 - 2s - loss: 0.6985 - recall: 0.6277 - val_loss: 0.5499 - val_recall: 0.6503 - 2s/epoch - 3ms/step
Epoch 23/100
558/558 - 2s - loss: 0.6912 - recall: 0.6461 - val_loss: 0.5531 - val_recall: 0.6544 - 2s/epoch - 3ms/step
Epoch 24/100
558/558 - 2s - loss: 0.6889 - recall: 0.6470 - val_loss: 0.5451 - val_recall: 0.6483 - 2s/epoch - 3ms/step
Epoch 25/100
558/558 - 1s - loss: 0.6917 - recall: 0.6475 - val_loss: 0.5480 - val_recall: 0.6564 - 1s/epoch - 3ms/step
Epoch 26/100
558/558 - 2s - loss: 0.6920 - recall: 0.6461 - val_loss: 0.5369 - val_recall: 0.6462 - 2s/epoch - 3ms/step
Epoch 27/100
558/558 - 2s - loss: 0.6955 - recall: 0.6344 - val_loss: 0.5433 - val_recall: 0.6646 - 2s/epoch - 4ms/step
Epoch 28/100
558/558 - 2s - loss: 0.6848 - recall: 0.6454 - val_loss: 0.5413 - val_recall: 0.6646 - 2s/epoch - 4ms/step
Epoch 29/100
558/558 - 2s - loss: 0.6819 - recall: 0.6369 - val_loss: 0.5403 - val_recall: 0.6667 - 2s/epoch - 4ms/step
Epoch 30/100
558/558 - 1s - loss: 0.6838 - recall: 0.6394 - val_loss: 0.5426 - val_recall: 0.6708 - 1s/epoch - 3ms/step
Epoch 31/100
558/558 - 1s - loss: 0.6713 - recall: 0.6515 - val_loss: 0.5425 - val_recall: 0.6728 - 1s/epoch - 3ms/step
Epoch 32/100
558/558 - 2s - loss: 0.6772 - recall: 0.6448 - val_loss: 0.5459 - val_recall: 0.6851 - 2s/epoch - 3ms/step
Epoch 33/100
558/558 - 1s - loss: 0.6700 - recall: 0.6533 - val_loss: 0.5383 - val_recall: 0.6728 - 1s/epoch - 3ms/step
Epoch 34/100
558/558 - 1s - loss: 0.6765 - recall: 0.6553 - val_loss: 0.5395 - val_recall: 0.6708 - 1s/epoch - 3ms/step
Epoch 35/100
558/558 - 1s - loss: 0.6782 - recall: 0.6472 - val_loss: 0.5462 - val_recall: 0.6810 - 1s/epoch - 3ms/step
Epoch 36/100
558/558 - 2s - loss: 0.6692 - recall: 0.6537 - val_loss: 0.5465 - val_recall: 0.6871 - 2s/epoch - 4ms/step
Epoch 37/100
558/558 - 3s - loss: 0.6602 - recall: 0.6600 - val_loss: 0.5385 - val_recall: 0.6769 - 3s/epoch - 5ms/step
Epoch 38/100
558/558 - 2s - loss: 0.6704 - recall: 0.6488 - val_loss: 0.5412 - val_recall: 0.6789 - 2s/epoch - 3ms/step
Epoch 39/100
558/558 - 2s - loss: 0.6571 - recall: 0.6533 - val_loss: 0.5371 - val_recall: 0.6830 - 2s/epoch - 3ms/step
Epoch 40/100
558/558 - 2s - loss: 0.6638 - recall: 0.6632 - val_loss: 0.5371 - val_recall: 0.6810 - 2s/epoch - 3ms/step
Epoch 41/100
558/558 - 2s - loss: 0.6625 - recall: 0.6625 - val_loss: 0.5301 - val_recall: 0.6769 - 2s/epoch - 3ms/step
Epoch 42/100
558/558 - 2s - loss: 0.6620 - recall: 0.6504 - val_loss: 0.5413 - val_recall: 0.6912 - 2s/epoch - 3ms/step
Epoch 43/100
558/558 - 2s - loss: 0.6507 - recall: 0.6681 - val_loss: 0.5404 - val_recall: 0.6892 - 2s/epoch - 3ms/step
Epoch 44/100
558/558 - 2s - loss: 0.6472 - recall: 0.6643 - val_loss: 0.5306 - val_recall: 0.6789 - 2s/epoch - 3ms/step
Epoch 45/100
558/558 - 3s - loss: 0.6584 - recall: 0.6587 - val_loss: 0.5303 - val_recall: 0.6830 - 3s/epoch - 5ms/step
Epoch 46/100
558/558 - 2s - loss: 0.6541 - recall: 0.6546 - val_loss: 0.5377 - val_recall: 0.6912 - 2s/epoch - 3ms/step
Epoch 47/100
558/558 - 1s - loss: 0.6529 - recall: 0.6573 - val_loss: 0.5338 - val_recall: 0.6871 - 1s/epoch - 3ms/step
Epoch 48/100
558/558 - 2s - loss: 0.6524 - recall: 0.6600 - val_loss: 0.5352 - val_recall: 0.6871 - 2s/epoch - 3ms/step
Epoch 49/100
558/558 - 2s - loss: 0.6473 - recall: 0.6625 - val_loss: 0.5339 - val_recall: 0.6912 - 2s/epoch - 3ms/step
Epoch 50/100
558/558 - 2s - loss: 0.6460 - recall: 0.6616 - val_loss: 0.5326 - val_recall: 0.6892 - 2s/epoch - 3ms/step
Epoch 51/100
558/558 - 1s - loss: 0.6452 - recall: 0.6623 - val_loss: 0.5380 - val_recall: 0.6912 - 1s/epoch - 3ms/step
Epoch 52/100
558/558 - 2s - loss: 0.6451 - recall: 0.6587 - val_loss: 0.5403 - val_recall: 0.6933 - 2s/epoch - 3ms/step
Epoch 53/100
558/558 - 2s - loss: 0.6452 - recall: 0.6658 - val_loss: 0.5386 - val_recall: 0.6953 - 2s/epoch - 4ms/step
Epoch 54/100
558/558 - 2s - loss: 0.6384 - recall: 0.6584 - val_loss: 0.5246 - val_recall: 0.6871 - 2s/epoch - 4ms/step
Epoch 55/100
558/558 - 2s - loss: 0.6353 - recall: 0.6672 - val_loss: 0.5317 - val_recall: 0.6933 - 2s/epoch - 3ms/step
Epoch 56/100
558/558 - 1s - loss: 0.6388 - recall: 0.6663 - val_loss: 0.5406 - val_recall: 0.7076 - 1s/epoch - 3ms/step
Epoch 57/100
558/558 - 2s - loss: 0.6401 - recall: 0.6679 - val_loss: 0.5283 - val_recall: 0.6912 - 2s/epoch - 3ms/step
Epoch 58/100
558/558 - 1s - loss: 0.6350 - recall: 0.6667 - val_loss: 0.5243 - val_recall: 0.6912 - 1s/epoch - 3ms/step
Epoch 59/100
558/558 - 1s - loss: 0.6310 - recall: 0.6764 - val_loss: 0.5332 - val_recall: 0.6994 - 1s/epoch - 3ms/step
Epoch 60/100
558/558 - 2s - loss: 0.6326 - recall: 0.6652 - val_loss: 0.5351 - val_recall: 0.7055 - 2s/epoch - 3ms/step
Epoch 61/100
558/558 - 2s - loss: 0.6275 - recall: 0.6715 - val_loss: 0.5287 - val_recall: 0.6973 - 2s/epoch - 3ms/step
Epoch 62/100
558/558 - 2s - loss: 0.6358 - recall: 0.6641 - val_loss: 0.5364 - val_recall: 0.7117 - 2s/epoch - 4ms/step
Epoch 63/100
558/558 - 2s - loss: 0.6268 - recall: 0.6699 - val_loss: 0.5312 - val_recall: 0.7035 - 2s/epoch - 4ms/step
Epoch 64/100
558/558 - 2s - loss: 0.6236 - recall: 0.6750 - val_loss: 0.5356 - val_recall: 0.7157 - 2s/epoch - 3ms/step
Epoch 65/100
558/558 - 2s - loss: 0.6332 - recall: 0.6685 - val_loss: 0.5375 - val_recall: 0.7198 - 2s/epoch - 3ms/step
Epoch 66/100
558/558 - 2s - loss: 0.6234 - recall: 0.6732 - val_loss: 0.5309 - val_recall: 0.7137 - 2s/epoch - 3ms/step
Epoch 67/100
558/558 - 2s - loss: 0.6235 - recall: 0.6822 - val_loss: 0.5358 - val_recall: 0.7219 - 2s/epoch - 3ms/step
Epoch 68/100
558/558 - 2s - loss: 0.6273 - recall: 0.6766 - val_loss: 0.5151 - val_recall: 0.6912 - 2s/epoch - 3ms/step
Epoch 69/100
558/558 - 2s - loss: 0.6105 - recall: 0.6865 - val_loss: 0.5253 - val_recall: 0.7055 - 2s/epoch - 3ms/step
Epoch 70/100
558/558 - 2s - loss: 0.6258 - recall: 0.6748 - val_loss: 0.5388 - val_recall: 0.7301 - 2s/epoch - 4ms/step
Epoch 71/100
558/558 - 2s - loss: 0.6211 - recall: 0.6773 - val_loss: 0.5331 - val_recall: 0.7198 - 2s/epoch - 4ms/step
Epoch 72/100
558/558 - 2s - loss: 0.6167 - recall: 0.6741 - val_loss: 0.5276 - val_recall: 0.7157 - 2s/epoch - 3ms/step
Epoch 73/100
558/558 - 2s - loss: 0.6233 - recall: 0.6735 - val_loss: 0.5307 - val_recall: 0.7198 - 2s/epoch - 3ms/step
Epoch 74/100
558/558 - 2s - loss: 0.6095 - recall: 0.6928 - val_loss: 0.5289 - val_recall: 0.7198 - 2s/epoch - 3ms/step
Epoch 75/100
558/558 - 1s - loss: 0.6100 - recall: 0.6833 - val_loss: 0.5245 - val_recall: 0.7137 - 1s/epoch - 3ms/step
Epoch 76/100
558/558 - 2s - loss: 0.6232 - recall: 0.6661 - val_loss: 0.5297 - val_recall: 0.7178 - 2s/epoch - 3ms/step
Epoch 77/100
558/558 - 1s - loss: 0.6066 - recall: 0.6923 - val_loss: 0.5265 - val_recall: 0.7178 - 1s/epoch - 3ms/step
Epoch 78/100
558/558 - 1s - loss: 0.6095 - recall: 0.6858 - val_loss: 0.5331 - val_recall: 0.7280 - 1s/epoch - 3ms/step
Epoch 79/100
558/558 - 2s - loss: 0.6147 - recall: 0.6782 - val_loss: 0.5306 - val_recall: 0.7219 - 2s/epoch - 4ms/step
Epoch 80/100
558/558 - 2s - loss: 0.6131 - recall: 0.6768 - val_loss: 0.5367 - val_recall: 0.7301 - 2s/epoch - 4ms/step
Epoch 81/100
558/558 - 2s - loss: 0.6124 - recall: 0.6759 - val_loss: 0.5251 - val_recall: 0.7219 - 2s/epoch - 3ms/step
Epoch 82/100
558/558 - 2s - loss: 0.6020 - recall: 0.6910 - val_loss: 0.5382 - val_recall: 0.7280 - 2s/epoch - 3ms/step
Epoch 83/100
558/558 - 2s - loss: 0.6094 - recall: 0.6793 - val_loss: 0.5247 - val_recall: 0.7198 - 2s/epoch - 3ms/step
Epoch 84/100
558/558 - 1s - loss: 0.6121 - recall: 0.6748 - val_loss: 0.5221 - val_recall: 0.7137 - 1s/epoch - 3ms/step
Epoch 85/100
558/558 - 1s - loss: 0.6116 - recall: 0.6847 - val_loss: 0.5231 - val_recall: 0.7157 - 1s/epoch - 3ms/step
Epoch 86/100
558/558 - 2s - loss: 0.6081 - recall: 0.6878 - val_loss: 0.5284 - val_recall: 0.7260 - 2s/epoch - 3ms/step
Epoch 87/100
558/558 - 2s - loss: 0.6102 - recall: 0.6797 - val_loss: 0.5281 - val_recall: 0.7280 - 2s/epoch - 3ms/step
Epoch 88/100
558/558 - 2s - loss: 0.6086 - recall: 0.6854 - val_loss: 0.5328 - val_recall: 0.7301 - 2s/epoch - 4ms/step
Epoch 89/100
558/558 - 2s - loss: 0.6089 - recall: 0.6820 - val_loss: 0.5213 - val_recall: 0.7137 - 2s/epoch - 4ms/step
Epoch 90/100
558/558 - 2s - loss: 0.6084 - recall: 0.6863 - val_loss: 0.5280 - val_recall: 0.7239 - 2s/epoch - 3ms/step
Epoch 91/100
558/558 - 2s - loss: 0.6025 - recall: 0.6842 - val_loss: 0.5308 - val_recall: 0.7321 - 2s/epoch - 3ms/step
Epoch 92/100
558/558 - 1s - loss: 0.6032 - recall: 0.6831 - val_loss: 0.5265 - val_recall: 0.7260 - 1s/epoch - 3ms/step
Epoch 93/100
558/558 - 2s - loss: 0.5929 - recall: 0.6963 - val_loss: 0.5222 - val_recall: 0.7198 - 2s/epoch - 3ms/step
Epoch 94/100
558/558 - 2s - loss: 0.5970 - recall: 0.6822 - val_loss: 0.5334 - val_recall: 0.7321 - 2s/epoch - 3ms/step
Epoch 95/100
558/558 - 2s - loss: 0.6062 - recall: 0.6806 - val_loss: 0.5246 - val_recall: 0.7301 - 2s/epoch - 3ms/step
Epoch 96/100
558/558 - 2s - loss: 0.6004 - recall: 0.6863 - val_loss: 0.5294 - val_recall: 0.7321 - 2s/epoch - 4ms/step
Epoch 97/100
558/558 - 3s - loss: 0.5937 - recall: 0.6937 - val_loss: 0.5234 - val_recall: 0.7301 - 3s/epoch - 5ms/step
Epoch 98/100
558/558 - 2s - loss: 0.6021 - recall: 0.6894 - val_loss: 0.5278 - val_recall: 0.7301 - 2s/epoch - 3ms/step
Epoch 99/100
558/558 - 1s - loss: 0.5967 - recall: 0.6896 - val_loss: 0.5263 - val_recall: 0.7301 - 1s/epoch - 3ms/step
Epoch 100/100
558/558 - 2s - loss: 0.5981 - recall: 0.6894 - val_loss: 0.5192 - val_recall: 0.7280 - 2s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                       2
# neurons - hidden layer              [128, 128, 64]
activation function - hidden layer    [relu, relu]
# epochs                              100
batch size                            16
optimizer                             Adam
learning rate, momentum, dropout      [1e-05, 0.0, 0.3]
weight initializer                    he_uniform
regularization                        -
train loss                            0.598142
validation loss                       0.519198
train recall                          0.689392
validation recall                     0.728016
time (secs)                           174.72
model                                 <keras.src.engine.sequential.Sequential object...
history                               <keras.src.callbacks.History object at 0x7dfcd...
Name: 12, dtype: object
The neural network built using Adam with a 30% dropout rate achieves the recall scores below:
train recall 0.689392 & validation recall 0.728016
The loss decreases with training, and the training and validation loss curves follow the same path, so the model does not appear to be overfitting.
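Dropout randomly zeroes a fraction `rate` of activations during training and rescales the survivors by 1/(1-rate) ("inverted dropout", which is what Keras' `Dropout` layer implements), so the expected activation stays unchanged and no rescaling is needed at inference. A minimal plain-Python sketch of the idea (not the notebook's `model_fit_with_dropout` helper):

```python
import random

def dropout(values, rate, training=True):
    """Inverted dropout: zero each activation with probability `rate` during
    training and scale survivors by 1/(1 - rate); identity at inference."""
    if not training or rate == 0.0:
        return list(values)
    keep = 1.0 - rate
    return [v / keep if random.random() < keep else 0.0 for v in values]

random.seed(0)
acts = [1.0] * 10000
dropped = dropout(acts, rate=0.3)

zero_frac = dropped.count(0.0) / len(dropped)  # ~0.3 of units zeroed
mean_act = sum(dropped) / len(dropped)          # ~1.0, expectation preserved
```

Because the expectation is preserved at train time, the same weights can be used unchanged when `training=False`.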
i+=1
model_fit_with_dropout('relu','relu','SGD-Mom',X_train_over,y_train_over,100,i,dropoutval=0.2,learning_rte=1e-5)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dropout (Dropout) (None, 128) 0
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization)  (None, 128)  512
dropout_1 (Dropout) (None, 128) 0
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization)  (None, 64)  256
dropout_2 (Dropout) (None, 64) 0
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
558/558 - 3s - loss: 1.0353 - recall: 0.4772 - val_loss: 0.8657 - val_recall: 0.3824 - 3s/epoch - 5ms/step
Epoch 2/100
558/558 - 1s - loss: 1.0327 - recall: 0.4725 - val_loss: 0.8577 - val_recall: 0.3824 - 1s/epoch - 2ms/step
Epoch 3/100
558/558 - 2s - loss: 1.0284 - recall: 0.4676 - val_loss: 0.8538 - val_recall: 0.3865 - 2s/epoch - 3ms/step
Epoch 4/100
558/558 - 2s - loss: 1.0394 - recall: 0.4759 - val_loss: 0.8611 - val_recall: 0.4029 - 2s/epoch - 3ms/step
Epoch 5/100
558/558 - 2s - loss: 1.0228 - recall: 0.4741 - val_loss: 0.8541 - val_recall: 0.4090 - 2s/epoch - 4ms/step
Epoch 6/100
558/558 - 1s - loss: 1.0152 - recall: 0.4732 - val_loss: 0.8505 - val_recall: 0.4131 - 1s/epoch - 3ms/step
Epoch 7/100
558/558 - 1s - loss: 1.0138 - recall: 0.4799 - val_loss: 0.8314 - val_recall: 0.3988 - 1s/epoch - 2ms/step
Epoch 8/100
558/558 - 1s - loss: 1.0091 - recall: 0.4752 - val_loss: 0.8275 - val_recall: 0.3947 - 1s/epoch - 3ms/step
Epoch 9/100
558/558 - 1s - loss: 0.9959 - recall: 0.4905 - val_loss: 0.8316 - val_recall: 0.4008 - 1s/epoch - 3ms/step
Epoch 10/100
558/558 - 2s - loss: 0.9939 - recall: 0.4923 - val_loss: 0.8241 - val_recall: 0.4110 - 2s/epoch - 3ms/step
Epoch 11/100
558/558 - 1s - loss: 0.9842 - recall: 0.4945 - val_loss: 0.8333 - val_recall: 0.4233 - 1s/epoch - 3ms/step
Epoch 12/100
558/558 - 1s - loss: 0.9780 - recall: 0.4954 - val_loss: 0.8128 - val_recall: 0.4049 - 1s/epoch - 3ms/step
Epoch 13/100
558/558 - 2s - loss: 0.9566 - recall: 0.5037 - val_loss: 0.8055 - val_recall: 0.4049 - 2s/epoch - 4ms/step
Epoch 14/100
558/558 - 2s - loss: 0.9702 - recall: 0.4956 - val_loss: 0.8221 - val_recall: 0.4417 - 2s/epoch - 4ms/step
Epoch 15/100
558/558 - 1s - loss: 0.9597 - recall: 0.5012 - val_loss: 0.8022 - val_recall: 0.4070 - 1s/epoch - 3ms/step
Epoch 16/100
558/558 - 1s - loss: 0.9671 - recall: 0.4956 - val_loss: 0.8081 - val_recall: 0.4274 - 1s/epoch - 3ms/step
Epoch 17/100
558/558 - 1s - loss: 0.9509 - recall: 0.4990 - val_loss: 0.7839 - val_recall: 0.4233 - 1s/epoch - 2ms/step
Epoch 18/100
558/558 - 1s - loss: 0.9607 - recall: 0.5006 - val_loss: 0.7883 - val_recall: 0.4335 - 1s/epoch - 3ms/step
Epoch 19/100
558/558 - 1s - loss: 0.9527 - recall: 0.4976 - val_loss: 0.7996 - val_recall: 0.4417 - 1s/epoch - 2ms/step
Epoch 20/100
558/558 - 1s - loss: 0.9363 - recall: 0.5169 - val_loss: 0.7788 - val_recall: 0.4397 - 1s/epoch - 2ms/step
Epoch 21/100
558/558 - 1s - loss: 0.9325 - recall: 0.5059 - val_loss: 0.7845 - val_recall: 0.4458 - 1s/epoch - 3ms/step
Epoch 22/100
558/558 - 2s - loss: 0.9293 - recall: 0.5028 - val_loss: 0.7749 - val_recall: 0.4417 - 2s/epoch - 3ms/step
Epoch 23/100
558/558 - 2s - loss: 0.9279 - recall: 0.5129 - val_loss: 0.7878 - val_recall: 0.4581 - 2s/epoch - 3ms/step
Epoch 24/100
558/558 - 2s - loss: 0.9157 - recall: 0.5221 - val_loss: 0.7760 - val_recall: 0.4622 - 2s/epoch - 4ms/step
Epoch 25/100
558/558 - 1s - loss: 0.9307 - recall: 0.5147 - val_loss: 0.7689 - val_recall: 0.4438 - 1s/epoch - 3ms/step
Epoch 26/100
558/558 - 1s - loss: 0.9206 - recall: 0.5102 - val_loss: 0.7676 - val_recall: 0.4519 - 1s/epoch - 2ms/step
Epoch 27/100
558/558 - 1s - loss: 0.9226 - recall: 0.5127 - val_loss: 0.7582 - val_recall: 0.4499 - 1s/epoch - 3ms/step
Epoch 28/100
558/558 - 1s - loss: 0.9065 - recall: 0.5223 - val_loss: 0.7482 - val_recall: 0.4519 - 1s/epoch - 3ms/step
Epoch 29/100
558/558 - 1s - loss: 0.9156 - recall: 0.5140 - val_loss: 0.7477 - val_recall: 0.4458 - 1s/epoch - 2ms/step
Epoch 30/100
558/558 - 1s - loss: 0.9102 - recall: 0.5165 - val_loss: 0.7470 - val_recall: 0.4519 - 1s/epoch - 2ms/step
Epoch 31/100
558/558 - 1s - loss: 0.8990 - recall: 0.5167 - val_loss: 0.7516 - val_recall: 0.4540 - 1s/epoch - 2ms/step
Epoch 32/100
558/558 - 2s - loss: 0.8888 - recall: 0.5308 - val_loss: 0.7485 - val_recall: 0.4785 - 2s/epoch - 3ms/step
Epoch 33/100
558/558 - 2s - loss: 0.8982 - recall: 0.5207 - val_loss: 0.7453 - val_recall: 0.4765 - 2s/epoch - 4ms/step
Epoch 34/100
558/558 - 2s - loss: 0.8965 - recall: 0.5239 - val_loss: 0.7380 - val_recall: 0.4703 - 2s/epoch - 3ms/step
Epoch 35/100
558/558 - 1s - loss: 0.8894 - recall: 0.5243 - val_loss: 0.7384 - val_recall: 0.4826 - 1s/epoch - 2ms/step
Epoch 36/100
558/558 - 1s - loss: 0.8952 - recall: 0.5145 - val_loss: 0.7304 - val_recall: 0.4663 - 1s/epoch - 3ms/step
Epoch 37/100
558/558 - 1s - loss: 0.8866 - recall: 0.5223 - val_loss: 0.7345 - val_recall: 0.4703 - 1s/epoch - 3ms/step
Epoch 38/100
558/558 - 1s - loss: 0.8843 - recall: 0.5373 - val_loss: 0.7357 - val_recall: 0.4847 - 1s/epoch - 2ms/step
Epoch 39/100
558/558 - 1s - loss: 0.8769 - recall: 0.5299 - val_loss: 0.7331 - val_recall: 0.4703 - 1s/epoch - 3ms/step
Epoch 40/100
558/558 - 1s - loss: 0.8732 - recall: 0.5398 - val_loss: 0.7273 - val_recall: 0.4724 - 1s/epoch - 2ms/step
Epoch 41/100
558/558 - 2s - loss: 0.8855 - recall: 0.5338 - val_loss: 0.7181 - val_recall: 0.4622 - 2s/epoch - 3ms/step
Epoch 42/100
558/558 - 3s - loss: 0.8762 - recall: 0.5396 - val_loss: 0.7245 - val_recall: 0.4826 - 3s/epoch - 5ms/step
Epoch 43/100
558/558 - 2s - loss: 0.8708 - recall: 0.5313 - val_loss: 0.7274 - val_recall: 0.4928 - 2s/epoch - 3ms/step
Epoch 44/100
558/558 - 1s - loss: 0.8656 - recall: 0.5373 - val_loss: 0.7152 - val_recall: 0.4785 - 1s/epoch - 3ms/step
Epoch 45/100
558/558 - 1s - loss: 0.8655 - recall: 0.5367 - val_loss: 0.7062 - val_recall: 0.4724 - 1s/epoch - 3ms/step
Epoch 46/100
558/558 - 1s - loss: 0.8596 - recall: 0.5499 - val_loss: 0.7009 - val_recall: 0.4642 - 1s/epoch - 2ms/step
Epoch 47/100
558/558 - 1s - loss: 0.8623 - recall: 0.5299 - val_loss: 0.7112 - val_recall: 0.4908 - 1s/epoch - 2ms/step
Epoch 48/100
558/558 - 1s - loss: 0.8674 - recall: 0.5416 - val_loss: 0.7104 - val_recall: 0.4990 - 1s/epoch - 2ms/step
Epoch 49/100
558/558 - 1s - loss: 0.8585 - recall: 0.5420 - val_loss: 0.7010 - val_recall: 0.4806 - 1s/epoch - 3ms/step
Epoch 50/100
558/558 - 2s - loss: 0.8548 - recall: 0.5468 - val_loss: 0.6948 - val_recall: 0.4806 - 2s/epoch - 3ms/step
Epoch 51/100
558/558 - 2s - loss: 0.8553 - recall: 0.5497 - val_loss: 0.7116 - val_recall: 0.5215 - 2s/epoch - 3ms/step
Epoch 52/100
558/558 - 2s - loss: 0.8464 - recall: 0.5450 - val_loss: 0.7110 - val_recall: 0.5194 - 2s/epoch - 4ms/step
Epoch 53/100
558/558 - 1s - loss: 0.8474 - recall: 0.5459 - val_loss: 0.6988 - val_recall: 0.5010 - 1s/epoch - 3ms/step
Epoch 54/100
558/558 - 1s - loss: 0.8435 - recall: 0.5616 - val_loss: 0.6840 - val_recall: 0.4867 - 1s/epoch - 3ms/step
Epoch 55/100
558/558 - 1s - loss: 0.8429 - recall: 0.5434 - val_loss: 0.6827 - val_recall: 0.4888 - 1s/epoch - 3ms/step
Epoch 56/100
558/558 - 1s - loss: 0.8475 - recall: 0.5396 - val_loss: 0.6999 - val_recall: 0.5072 - 1s/epoch - 3ms/step
Epoch 57/100
558/558 - 1s - loss: 0.8343 - recall: 0.5521 - val_loss: 0.6922 - val_recall: 0.5072 - 1s/epoch - 2ms/step
Epoch 58/100
558/558 - 1s - loss: 0.8341 - recall: 0.5515 - val_loss: 0.6833 - val_recall: 0.4949 - 1s/epoch - 3ms/step
Epoch 59/100
558/558 - 1s - loss: 0.8387 - recall: 0.5472 - val_loss: 0.6873 - val_recall: 0.5031 - 1s/epoch - 3ms/step
Epoch 60/100
558/558 - 2s - loss: 0.8174 - recall: 0.5665 - val_loss: 0.6852 - val_recall: 0.5092 - 2s/epoch - 3ms/step
Epoch 61/100
558/558 - 2s - loss: 0.8295 - recall: 0.5548 - val_loss: 0.6849 - val_recall: 0.5051 - 2s/epoch - 4ms/step
Epoch 62/100
558/558 - 1s - loss: 0.8260 - recall: 0.5481 - val_loss: 0.6799 - val_recall: 0.5031 - 1s/epoch - 2ms/step
Epoch 63/100
558/558 - 1s - loss: 0.8218 - recall: 0.5553 - val_loss: 0.6807 - val_recall: 0.5072 - 1s/epoch - 3ms/step
Epoch 64/100
558/558 - 1s - loss: 0.8196 - recall: 0.5557 - val_loss: 0.6794 - val_recall: 0.5133 - 1s/epoch - 3ms/step
Epoch 65/100
558/558 - 1s - loss: 0.8311 - recall: 0.5512 - val_loss: 0.6879 - val_recall: 0.5317 - 1s/epoch - 3ms/step
Epoch 66/100
558/558 - 1s - loss: 0.8168 - recall: 0.5595 - val_loss: 0.6800 - val_recall: 0.5235 - 1s/epoch - 2ms/step
Epoch 67/100
558/558 - 1s - loss: 0.8164 - recall: 0.5640 - val_loss: 0.6747 - val_recall: 0.5194 - 1s/epoch - 3ms/step
Epoch 68/100
558/558 - 1s - loss: 0.8161 - recall: 0.5600 - val_loss: 0.6711 - val_recall: 0.5194 - 1s/epoch - 3ms/step
Epoch 69/100
558/558 - 2s - loss: 0.8138 - recall: 0.5620 - val_loss: 0.6717 - val_recall: 0.5235 - 2s/epoch - 3ms/step
Epoch 70/100
558/558 - 2s - loss: 0.8111 - recall: 0.5595 - val_loss: 0.6746 - val_recall: 0.5276 - 2s/epoch - 3ms/step
Epoch 71/100
558/558 - 2s - loss: 0.8037 - recall: 0.5712 - val_loss: 0.6814 - val_recall: 0.5337 - 2s/epoch - 3ms/step
Epoch 72/100
558/558 - 1s - loss: 0.8026 - recall: 0.5770 - val_loss: 0.6703 - val_recall: 0.5256 - 1s/epoch - 3ms/step
Epoch 73/100
558/558 - 1s - loss: 0.8151 - recall: 0.5533 - val_loss: 0.6631 - val_recall: 0.5174 - 1s/epoch - 2ms/step
Epoch 74/100
558/558 - 1s - loss: 0.8026 - recall: 0.5566 - val_loss: 0.6664 - val_recall: 0.5235 - 1s/epoch - 2ms/step
Epoch 75/100
558/558 - 1s - loss: 0.7997 - recall: 0.5692 - val_loss: 0.6736 - val_recall: 0.5256 - 1s/epoch - 3ms/step
Epoch 76/100
558/558 - 1s - loss: 0.8069 - recall: 0.5557 - val_loss: 0.6702 - val_recall: 0.5378 - 1s/epoch - 2ms/step
Epoch 77/100
558/558 - 1s - loss: 0.7877 - recall: 0.5739 - val_loss: 0.6669 - val_recall: 0.5317 - 1s/epoch - 3ms/step
Epoch 78/100
558/558 - 2s - loss: 0.8027 - recall: 0.5757 - val_loss: 0.6629 - val_recall: 0.5276 - 2s/epoch - 3ms/step
Epoch 79/100
558/558 - 2s - loss: 0.8077 - recall: 0.5649 - val_loss: 0.6585 - val_recall: 0.5297 - 2s/epoch - 3ms/step
Epoch 80/100
558/558 - 2s - loss: 0.7885 - recall: 0.5699 - val_loss: 0.6673 - val_recall: 0.5358 - 2s/epoch - 4ms/step
Epoch 81/100
558/558 - 1s - loss: 0.7960 - recall: 0.5665 - val_loss: 0.6612 - val_recall: 0.5358 - 1s/epoch - 3ms/step
Epoch 82/100
558/558 - 1s - loss: 0.7954 - recall: 0.5690 - val_loss: 0.6623 - val_recall: 0.5378 - 1s/epoch - 2ms/step
Epoch 83/100
558/558 - 1s - loss: 0.7968 - recall: 0.5631 - val_loss: 0.6609 - val_recall: 0.5297 - 1s/epoch - 2ms/step
Epoch 84/100
558/558 - 1s - loss: 0.7806 - recall: 0.5791 - val_loss: 0.6602 - val_recall: 0.5317 - 1s/epoch - 3ms/step
Epoch 85/100
558/558 - 1s - loss: 0.7880 - recall: 0.5732 - val_loss: 0.6521 - val_recall: 0.5358 - 1s/epoch - 3ms/step
Epoch 86/100
558/558 - 1s - loss: 0.7895 - recall: 0.5719 - val_loss: 0.6547 - val_recall: 0.5297 - 1s/epoch - 3ms/step
Epoch 87/100
558/558 - 1s - loss: 0.7866 - recall: 0.5882 - val_loss: 0.6556 - val_recall: 0.5440 - 1s/epoch - 3ms/step
Epoch 88/100
558/558 - 2s - loss: 0.7910 - recall: 0.5687 - val_loss: 0.6582 - val_recall: 0.5399 - 2s/epoch - 3ms/step
Epoch 89/100
558/558 - 2s - loss: 0.7860 - recall: 0.5835 - val_loss: 0.6488 - val_recall: 0.5358 - 2s/epoch - 3ms/step
Epoch 90/100
558/558 - 2s - loss: 0.7808 - recall: 0.5804 - val_loss: 0.6531 - val_recall: 0.5542 - 2s/epoch - 3ms/step
Epoch 91/100
558/558 - 1s - loss: 0.7877 - recall: 0.5802 - val_loss: 0.6533 - val_recall: 0.5521 - 1s/epoch - 3ms/step
Epoch 92/100
558/558 - 1s - loss: 0.7712 - recall: 0.5777 - val_loss: 0.6477 - val_recall: 0.5440 - 1s/epoch - 3ms/step
Epoch 93/100
558/558 - 1s - loss: 0.7707 - recall: 0.5725 - val_loss: 0.6487 - val_recall: 0.5521 - 1s/epoch - 3ms/step
Epoch 94/100
558/558 - 1s - loss: 0.7758 - recall: 0.5824 - val_loss: 0.6565 - val_recall: 0.5665 - 1s/epoch - 2ms/step
Epoch 95/100
558/558 - 1s - loss: 0.7865 - recall: 0.5717 - val_loss: 0.6396 - val_recall: 0.5440 - 1s/epoch - 3ms/step
Epoch 96/100
558/558 - 1s - loss: 0.7748 - recall: 0.5795 - val_loss: 0.6405 - val_recall: 0.5481 - 1s/epoch - 2ms/step
Epoch 97/100
558/558 - 2s - loss: 0.7727 - recall: 0.5858 - val_loss: 0.6432 - val_recall: 0.5521 - 2s/epoch - 3ms/step
Epoch 98/100
558/558 - 2s - loss: 0.7752 - recall: 0.5737 - val_loss: 0.6387 - val_recall: 0.5542 - 2s/epoch - 3ms/step
Epoch 99/100
558/558 - 2s - loss: 0.7704 - recall: 0.5806 - val_loss: 0.6490 - val_recall: 0.5665 - 2s/epoch - 4ms/step
Epoch 100/100
558/558 - 2s - loss: 0.7589 - recall: 0.5981 - val_loss: 0.6504 - val_recall: 0.5706 - 2s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                       2
# neurons - hidden layer              [128, 128, 64]
activation function - hidden layer    [relu, relu]
# epochs                              100
batch size                            16
optimizer                             SGD-Mom
learning rate, momentum, dropout      [1e-05, 0.0, 0.2]
weight initializer                    he_uniform
regularization                        -
train loss                            0.758918
validation loss                       0.650358
train recall                          0.598116
validation recall                     0.570552
time (secs)                           203.18
model                                 <keras.src.engine.sequential.Sequential object...
history                               <keras.src.callbacks.History object at 0x7dfcd...
Name: 13, dtype: object
The neural network built using SGD (with zero momentum) and a 20% dropout rate is not learning well enough to reduce the loss: it remains very high, at about 0.76 on the training data.
train recall 0.598116 & validation recall 0.570552
The recall scores from this model are also very low, making it unsuitable.
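The models here are judged on recall because, for churn, a false negative (a churner predicted to stay) is the costly error. Recall is TP / (TP + FN); a minimal sketch, using illustrative toy labels rather than this notebook's data:

```python
def recall(y_true, y_pred):
    """Recall = true positives / (true positives + false negatives)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn) if (tp + fn) else 0.0

# 4 actual churners, 3 caught by the model -> recall 0.75
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 0]
print(recall(y_true, y_pred))  # 0.75
```

Note that recall ignores false positives entirely, which is why it is paired with the loss when comparing these runs.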
i+=1
model_fit_with_dropout('relu','relu','SGD-Mom',X_train_over,y_train_over,100,i,dropoutval=0.2,learning_rte=1e-5, momentumval=0.9)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dropout (Dropout) (None, 128) 0
dense_1 (Dense) (None, 128) 16512
batch_normalization (BatchNormalization)  (None, 128)  512
dropout_1 (Dropout) (None, 128) 0
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (BatchNormalization)  (None, 64)  256
dropout_2 (Dropout) (None, 64) 0
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
558/558 - 6s - loss: 1.0108 - recall: 0.4864 - val_loss: 0.8172 - val_recall: 0.4008 - 6s/epoch - 10ms/step
Epoch 2/100
558/558 - 3s - loss: 0.9624 - recall: 0.4932 - val_loss: 0.7742 - val_recall: 0.4294 - 3s/epoch - 5ms/step
Epoch 3/100
558/558 - 3s - loss: 0.9212 - recall: 0.5111 - val_loss: 0.7420 - val_recall: 0.4438 - 3s/epoch - 5ms/step
Epoch 4/100
558/558 - 3s - loss: 0.9002 - recall: 0.5268 - val_loss: 0.7247 - val_recall: 0.4806 - 3s/epoch - 6ms/step
Epoch 5/100
558/558 - 4s - loss: 0.8660 - recall: 0.5340 - val_loss: 0.7095 - val_recall: 0.5133 - 4s/epoch - 8ms/step
Epoch 6/100
558/558 - 3s - loss: 0.8374 - recall: 0.5544 - val_loss: 0.6882 - val_recall: 0.5194 - 3s/epoch - 5ms/step
Epoch 7/100
558/558 - 3s - loss: 0.8256 - recall: 0.5537 - val_loss: 0.6632 - val_recall: 0.5133 - 3s/epoch - 5ms/step
Epoch 8/100
558/558 - 3s - loss: 0.8071 - recall: 0.5609 - val_loss: 0.6530 - val_recall: 0.5276 - 3s/epoch - 5ms/step
Epoch 9/100
558/558 - 5s - loss: 0.7886 - recall: 0.5725 - val_loss: 0.6491 - val_recall: 0.5440 - 5s/epoch - 8ms/step
Epoch 10/100
558/558 - 3s - loss: 0.7749 - recall: 0.5764 - val_loss: 0.6390 - val_recall: 0.5542 - 3s/epoch - 6ms/step
Epoch 11/100
558/558 - 3s - loss: 0.7623 - recall: 0.5905 - val_loss: 0.6383 - val_recall: 0.5685 - 3s/epoch - 5ms/step
Epoch 12/100
558/558 - 3s - loss: 0.7493 - recall: 0.6008 - val_loss: 0.6261 - val_recall: 0.5706 - 3s/epoch - 5ms/step
Epoch 13/100
558/558 - 2s - loss: 0.7336 - recall: 0.6008 - val_loss: 0.6208 - val_recall: 0.5808 - 2s/epoch - 4ms/step
Epoch 14/100
558/558 - 2s - loss: 0.7360 - recall: 0.6066 - val_loss: 0.6359 - val_recall: 0.6278 - 2s/epoch - 3ms/step
Epoch 15/100
558/558 - 2s - loss: 0.7259 - recall: 0.6060 - val_loss: 0.6192 - val_recall: 0.6176 - 2s/epoch - 4ms/step
Epoch 16/100
558/558 - 2s - loss: 0.7283 - recall: 0.6075 - val_loss: 0.6211 - val_recall: 0.6299 - 2s/epoch - 4ms/step
Epoch 17/100
558/558 - 1s - loss: 0.7076 - recall: 0.6158 - val_loss: 0.6155 - val_recall: 0.6360 - 1s/epoch - 3ms/step
Epoch 18/100
558/558 - 1s - loss: 0.7141 - recall: 0.6232 - val_loss: 0.6157 - val_recall: 0.6421 - 1s/epoch - 2ms/step
Epoch 19/100
558/558 - 1s - loss: 0.7093 - recall: 0.6190 - val_loss: 0.6111 - val_recall: 0.6421 - 1s/epoch - 3ms/step
Epoch 20/100
558/558 - 1s - loss: 0.6968 - recall: 0.6286 - val_loss: 0.6044 - val_recall: 0.6319 - 1s/epoch - 3ms/step
Epoch 21/100
558/558 - 1s - loss: 0.6932 - recall: 0.6203 - val_loss: 0.6113 - val_recall: 0.6503 - 1s/epoch - 3ms/step
Epoch 22/100
558/558 - 1s - loss: 0.6934 - recall: 0.6257 - val_loss: 0.6065 - val_recall: 0.6544 - 1s/epoch - 3ms/step
Epoch 23/100
558/558 - 2s - loss: 0.6927 - recall: 0.6268 - val_loss: 0.6111 - val_recall: 0.6544 - 2s/epoch - 3ms/step
Epoch 24/100
558/558 - 2s - loss: 0.6804 - recall: 0.6412 - val_loss: 0.6015 - val_recall: 0.6544 - 2s/epoch - 4ms/step
Epoch 25/100
558/558 - 2s - loss: 0.6879 - recall: 0.6367 - val_loss: 0.5967 - val_recall: 0.6483 - 2s/epoch - 3ms/step
Epoch 26/100
558/558 - 2s - loss: 0.6817 - recall: 0.6315 - val_loss: 0.5966 - val_recall: 0.6605 - 2s/epoch - 3ms/step
Epoch 27/100
558/558 - 1s - loss: 0.6838 - recall: 0.6304 - val_loss: 0.5968 - val_recall: 0.6605 - 1s/epoch - 3ms/step
Epoch 28/100
558/558 - 1s - loss: 0.6743 - recall: 0.6405 - val_loss: 0.5949 - val_recall: 0.6605 - 1s/epoch - 2ms/step
Epoch 29/100
558/558 - 1s - loss: 0.6730 - recall: 0.6425 - val_loss: 0.5895 - val_recall: 0.6544 - 1s/epoch - 3ms/step
Epoch 30/100
558/558 - 1s - loss: 0.6713 - recall: 0.6371 - val_loss: 0.5898 - val_recall: 0.6646 - 1s/epoch - 3ms/step
Epoch 31/100
558/558 - 1s - loss: 0.6664 - recall: 0.6459 - val_loss: 0.5911 - val_recall: 0.6687 - 1s/epoch - 3ms/step
Epoch 32/100
558/558 - 2s - loss: 0.6597 - recall: 0.6501 - val_loss: 0.5977 - val_recall: 0.6789 - 2s/epoch - 3ms/step
Epoch 33/100
558/558 - 2s - loss: 0.6631 - recall: 0.6463 - val_loss: 0.5898 - val_recall: 0.6728 - 2s/epoch - 4ms/step
Epoch 34/100
558/558 - 2s - loss: 0.6652 - recall: 0.6427 - val_loss: 0.5893 - val_recall: 0.6687 - 2s/epoch - 3ms/step
Epoch 35/100
558/558 - 1s - loss: 0.6636 - recall: 0.6510 - val_loss: 0.5971 - val_recall: 0.6748 - 1s/epoch - 2ms/step
Epoch 36/100
558/558 - 1s - loss: 0.6663 - recall: 0.6470 - val_loss: 0.5939 - val_recall: 0.6769 - 1s/epoch - 3ms/step
Epoch 37/100
558/558 - 1s - loss: 0.6578 - recall: 0.6508 - val_loss: 0.5867 - val_recall: 0.6728 - 1s/epoch - 3ms/step
Epoch 38/100
558/558 - 1s - loss: 0.6584 - recall: 0.6452 - val_loss: 0.5925 - val_recall: 0.6851 - 1s/epoch - 3ms/step
Epoch 39/100
558/558 - 1s - loss: 0.6506 - recall: 0.6468 - val_loss: 0.5859 - val_recall: 0.6789 - 1s/epoch - 3ms/step
Epoch 40/100
558/558 - 2s - loss: 0.6562 - recall: 0.6535 - val_loss: 0.5859 - val_recall: 0.6830 - 2s/epoch - 3ms/step
Epoch 41/100
558/558 - 2s - loss: 0.6555 - recall: 0.6587 - val_loss: 0.5778 - val_recall: 0.6748 - 2s/epoch - 3ms/step
Epoch 42/100
558/558 - 2s - loss: 0.6519 - recall: 0.6544 - val_loss: 0.5904 - val_recall: 0.6933 - 2s/epoch - 4ms/step
Epoch 43/100
558/558 - 2s - loss: 0.6454 - recall: 0.6641 - val_loss: 0.5897 - val_recall: 0.6933 - 2s/epoch - 4ms/step
Epoch 44/100
558/558 - 1s - loss: 0.6483 - recall: 0.6537 - val_loss: 0.5813 - val_recall: 0.6892 - 1s/epoch - 3ms/step
Epoch 45/100
558/558 - 1s - loss: 0.6470 - recall: 0.6578 - val_loss: 0.5775 - val_recall: 0.6810 - 1s/epoch - 3ms/step
Epoch 46/100
558/558 - 1s - loss: 0.6469 - recall: 0.6555 - val_loss: 0.5852 - val_recall: 0.7014 - 1s/epoch - 3ms/step
Epoch 47/100
558/558 - 1s - loss: 0.6447 - recall: 0.6618 - val_loss: 0.5818 - val_recall: 0.6953 - 1s/epoch - 3ms/step
Epoch 48/100
558/558 - 1s - loss: 0.6528 - recall: 0.6578 - val_loss: 0.5829 - val_recall: 0.6953 - 1s/epoch - 3ms/step
Epoch 49/100
558/558 - 1s - loss: 0.6425 - recall: 0.6609 - val_loss: 0.5791 - val_recall: 0.6973 - 1s/epoch - 3ms/step
Epoch 50/100
558/558 - 2s - loss: 0.6407 - recall: 0.6562 - val_loss: 0.5752 - val_recall: 0.6933 - 2s/epoch - 3ms/step
Epoch 51/100
558/558 - 2s - loss: 0.6416 - recall: 0.6647 - val_loss: 0.5898 - val_recall: 0.7076 - 2s/epoch - 4ms/step
Epoch 52/100
558/558 - 2s - loss: 0.6402 - recall: 0.6515 - val_loss: 0.5877 - val_recall: 0.7055 - 2s/epoch - 4ms/step
Epoch 53/100
558/558 - 1s - loss: 0.6365 - recall: 0.6616 - val_loss: 0.5807 - val_recall: 0.7055 - 1s/epoch - 3ms/step
Epoch 54/100
558/558 - 1s - loss: 0.6378 - recall: 0.6658 - val_loss: 0.5744 - val_recall: 0.7014 - 1s/epoch - 3ms/step
Epoch 55/100
558/558 - 1s - loss: 0.6319 - recall: 0.6625 - val_loss: 0.5720 - val_recall: 0.6953 - 1s/epoch - 3ms/step
Epoch 56/100
558/558 - 1s - loss: 0.6373 - recall: 0.6591 - val_loss: 0.5863 - val_recall: 0.7055 - 1s/epoch - 3ms/step
Epoch 57/100
558/558 - 1s - loss: 0.6295 - recall: 0.6638 - val_loss: 0.5748 - val_recall: 0.6973 - 1s/epoch - 3ms/step
Epoch 58/100
558/558 - 1s - loss: 0.6313 - recall: 0.6681 - val_loss: 0.5734 - val_recall: 0.7014 - 1s/epoch - 3ms/step
Epoch 59/100
558/558 - 2s - loss: 0.6337 - recall: 0.6584 - val_loss: 0.5778 - val_recall: 0.7035 - 2s/epoch - 3ms/step
Epoch 60/100
558/558 - 2s - loss: 0.6214 - recall: 0.6728 - val_loss: 0.5789 - val_recall: 0.7096 - 2s/epoch - 4ms/step
Epoch 61/100
558/558 - 2s - loss: 0.6298 - recall: 0.6703 - val_loss: 0.5750 - val_recall: 0.7014 - 2s/epoch - 4ms/step
Epoch 62/100
558/558 - 1s - loss: 0.6281 - recall: 0.6638 - val_loss: 0.5800 - val_recall: 0.7096 - 1s/epoch - 2ms/step
Epoch 63/100
558/558 - 1s - loss: 0.6272 - recall: 0.6602 - val_loss: 0.5817 - val_recall: 0.7117 - 1s/epoch - 2ms/step
Epoch 64/100
558/558 - 1s - loss: 0.6244 - recall: 0.6739 - val_loss: 0.5822 - val_recall: 0.7137 - 1s/epoch - 3ms/step
Epoch 65/100
558/558 - 2s - loss: 0.6343 - recall: 0.6629 - val_loss: 0.5828 - val_recall: 0.7178 - 2s/epoch - 3ms/step
Epoch 66/100
558/558 - 1s - loss: 0.6262 - recall: 0.6732 - val_loss: 0.5762 - val_recall: 0.7117 - 1s/epoch - 3ms/step
Epoch 67/100
558/558 - 1s - loss: 0.6216 - recall: 0.6748 - val_loss: 0.5798 - val_recall: 0.7178 - 1s/epoch - 3ms/step
Epoch 68/100
558/558 - 2s - loss: 0.6278 - recall: 0.6771 - val_loss: 0.5650 - val_recall: 0.6994 - 2s/epoch - 4ms/step
Epoch 69/100
558/558 - 2s - loss: 0.6214 - recall: 0.6768 - val_loss: 0.5731 - val_recall: 0.7096 - 2s/epoch - 4ms/step
Epoch 70/100
558/558 - 2s - loss: 0.6227 - recall: 0.6757 - val_loss: 0.5809 - val_recall: 0.7198 - 2s/epoch - 4ms/step
Epoch 71/100
558/558 - 1s - loss: 0.6167 - recall: 0.6701 - val_loss: 0.5832 - val_recall: 0.7178 - 1s/epoch - 2ms/step
Epoch 72/100
558/558 - 1s - loss: 0.6162 - recall: 0.6827 - val_loss: 0.5726 - val_recall: 0.7178 - 1s/epoch - 2ms/step
Epoch 73/100
558/558 - 1s - loss: 0.6260 - recall: 0.6688 - val_loss: 0.5783 - val_recall: 0.7239 - 1s/epoch - 3ms/step
Epoch 74/100
558/558 - 1s - loss: 0.6160 - recall: 0.6802 - val_loss: 0.5734 - val_recall: 0.7219 - 1s/epoch - 2ms/step
Epoch 75/100
558/558 - 1s - loss: 0.6171 - recall: 0.6759 - val_loss: 0.5730 - val_recall: 0.7178 - 1s/epoch - 3ms/step
Epoch 76/100
558/558 - 1s - loss: 0.6215 - recall: 0.6766 - val_loss: 0.5771 - val_recall: 0.7219 - 1s/epoch - 3ms/step
Epoch 77/100
558/558 - 1s - loss: 0.6086 - recall: 0.6901 - val_loss: 0.5735 - val_recall: 0.7198 - 1s/epoch - 3ms/step
Epoch 78/100
558/558 - 2s - loss: 0.6193 - recall: 0.6739 - val_loss: 0.5808 - val_recall: 0.7260 - 2s/epoch - 4ms/step
Epoch 79/100
558/558 - 2s - loss: 0.6192 - recall: 0.6775 - val_loss: 0.5755 - val_recall: 0.7239 - 2s/epoch - 4ms/step
Epoch 80/100
558/558 - 2s - loss: 0.6108 - recall: 0.6768 - val_loss: 0.5813 - val_recall: 0.7280 - 2s/epoch - 3ms/step
Epoch 81/100
558/558 - 1s - loss: 0.6120 - recall: 0.6782 - val_loss: 0.5725 - val_recall: 0.7219 - 1s/epoch - 2ms/step
Epoch 82/100
558/558 - 1s - loss: 0.6106 - recall: 0.6793 - val_loss: 0.5837 - val_recall: 0.7342 - 1s/epoch - 2ms/step
Epoch 83/100
558/558 - 1s - loss: 0.6145 - recall: 0.6766 - val_loss: 0.5748 - val_recall: 0.7260 - 1s/epoch - 2ms/step
Epoch 84/100
558/558 - 1s - loss: 0.6130 - recall: 0.6732 - val_loss: 0.5739 - val_recall: 0.7301 - 1s/epoch - 3ms/step
Epoch 85/100
558/558 - 1s - loss: 0.6140 - recall: 0.6777 - val_loss: 0.5685 - val_recall: 0.7137 - 1s/epoch - 3ms/step
Epoch 86/100
558/558 - 1s - loss: 0.6111 - recall: 0.6809 - val_loss: 0.5736 - val_recall: 0.7219 - 1s/epoch - 3ms/step
Epoch 87/100
558/558 - 2s - loss: 0.6102 - recall: 0.6782 - val_loss: 0.5780 - val_recall: 0.7342 - 2s/epoch - 3ms/step
Epoch 88/100
558/558 - 2s - loss: 0.6136 - recall: 0.6860 - val_loss: 0.5815 - val_recall: 0.7382 - 2s/epoch - 4ms/step
Epoch 89/100
558/558 - 2s - loss: 0.6139 - recall: 0.6777 - val_loss: 0.5704 - val_recall: 0.7280 - 2s/epoch - 3ms/step
Epoch 90/100
558/558 - 1s - loss: 0.6148 - recall: 0.6827 - val_loss: 0.5773 - val_recall: 0.7301 - 1s/epoch - 2ms/step
Epoch 91/100
558/558 - 1s - loss: 0.6124 - recall: 0.6820 - val_loss: 0.5766 - val_recall: 0.7342 - 1s/epoch - 3ms/step
Epoch 92/100
558/558 - 1s - loss: 0.6061 - recall: 0.6901 - val_loss: 0.5728 - val_recall: 0.7301 - 1s/epoch - 2ms/step
Epoch 93/100
558/558 - 1s - loss: 0.6013 - recall: 0.6858 - val_loss: 0.5698 - val_recall: 0.7260 - 1s/epoch - 3ms/step
Epoch 94/100
558/558 - 1s - loss: 0.6012 - recall: 0.6896 - val_loss: 0.5828 - val_recall: 0.7464 - 1s/epoch - 3ms/step
Epoch 95/100
558/558 - 1s - loss: 0.6110 - recall: 0.6661 - val_loss: 0.5670 - val_recall: 0.7219 - 1s/epoch - 3ms/step
Epoch 96/100
558/558 - 2s - loss: 0.6040 - recall: 0.6880 - val_loss: 0.5735 - val_recall: 0.7362 - 2s/epoch - 3ms/step
Epoch 97/100
558/558 - 2s - loss: 0.6042 - recall: 0.6871 - val_loss: 0.5720 - val_recall: 0.7301 - 2s/epoch - 4ms/step
Epoch 98/100
558/558 - 2s - loss: 0.6115 - recall: 0.6784 - val_loss: 0.5746 - val_recall: 0.7321 - 2s/epoch - 4ms/step
Epoch 99/100
558/558 - 1s - loss: 0.6026 - recall: 0.6880 - val_loss: 0.5740 - val_recall: 0.7382 - 1s/epoch - 2ms/step
Epoch 100/100
558/558 - 1s - loss: 0.5990 - recall: 0.6880 - val_loss: 0.5684 - val_recall: 0.7239 - 1s/epoch - 3ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                                                       2
# neurons - hidden layer                                 [128, 128, 64]
activation function - hidden layer                         [relu, relu]
# epochs                                                            100
batch size                                                           16
optimizer                                                       SGD-Mom
learning rate, momentum, dropout                      [1e-05, 0.9, 0.2]
weight initializer                                           he_uniform
regularization                                                        -
train loss                                                      0.59902
validation loss                                                0.568359
train recall                                                   0.688047
validation recall                                              0.723926
time (secs)                                                      204.87
model                 <keras.src.engine.sequential.Sequential object...
history               <keras.src.callbacks.History object at 0x7dfcd...
Name: 16, dtype: object
The neural network built with SGD with momentum and a 20% dropout rate shows a steady reduction in the loss, indicating that it is learning.
train recall 0.688047 & validation recall 0.723926
The recall scores from this model are decent, so we can try it on the test data.
i+=1
model_fit_with_dropout('relu','relu','SGD-Mom',X_train_over,y_train_over,100,i,dropoutval=0.3,learning_rte=1e-5, momentumval=0.75)
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 1536
dropout (Dropout) (None, 128) 0
dense_1 (Dense) (None, 128) 16512
batch_normalization (Batch (None, 128) 512
Normalization)
dropout_1 (Dropout) (None, 128) 0
dense_2 (Dense) (None, 64) 8256
batch_normalization_1 (Bat (None, 64) 256
chNormalization)
dropout_2 (Dropout) (None, 64) 0
dense_3 (Dense) (None, 1) 65
=================================================================
Total params: 27137 (106.00 KB)
Trainable params: 26753 (104.50 KB)
Non-trainable params: 384 (1.50 KB)
_________________________________________________________________
Epoch 1/100
558/558 - 5s - loss: 1.0419 - recall: 0.4804 - val_loss: 0.7691 - val_recall: 0.3374 - 5s/epoch - 9ms/step
Epoch 2/100
558/558 - 1s - loss: 1.0349 - recall: 0.4797 - val_loss: 0.7538 - val_recall: 0.3476 - 1s/epoch - 3ms/step
Epoch 3/100
558/558 - 2s - loss: 1.0166 - recall: 0.4819 - val_loss: 0.7488 - val_recall: 0.3558 - 2s/epoch - 3ms/step
Epoch 4/100
558/558 - 1s - loss: 1.0038 - recall: 0.4994 - val_loss: 0.7504 - val_recall: 0.3967 - 1s/epoch - 3ms/step
Epoch 5/100
558/558 - 1s - loss: 0.9916 - recall: 0.4905 - val_loss: 0.7336 - val_recall: 0.3865 - 1s/epoch - 3ms/step
Epoch 6/100
558/558 - 1s - loss: 0.9678 - recall: 0.4990 - val_loss: 0.7212 - val_recall: 0.4049 - 1s/epoch - 2ms/step
Epoch 7/100
558/558 - 1s - loss: 0.9630 - recall: 0.5098 - val_loss: 0.7048 - val_recall: 0.3947 - 1s/epoch - 3ms/step
Epoch 8/100
558/558 - 2s - loss: 0.9350 - recall: 0.5115 - val_loss: 0.7045 - val_recall: 0.4029 - 2s/epoch - 3ms/step
Epoch 9/100
558/558 - 2s - loss: 0.9353 - recall: 0.5183 - val_loss: 0.6953 - val_recall: 0.4110 - 2s/epoch - 4ms/step
Epoch 10/100
558/558 - 2s - loss: 0.9383 - recall: 0.5201 - val_loss: 0.6941 - val_recall: 0.4254 - 2s/epoch - 3ms/step
Epoch 11/100
558/558 - 1s - loss: 0.9210 - recall: 0.5237 - val_loss: 0.6926 - val_recall: 0.4356 - 1s/epoch - 3ms/step
Epoch 12/100
558/558 - 1s - loss: 0.9076 - recall: 0.5214 - val_loss: 0.6783 - val_recall: 0.4192 - 1s/epoch - 3ms/step
Epoch 13/100
558/558 - 1s - loss: 0.8885 - recall: 0.5232 - val_loss: 0.6653 - val_recall: 0.4090 - 1s/epoch - 2ms/step
Epoch 14/100
558/558 - 1s - loss: 0.8902 - recall: 0.5335 - val_loss: 0.6786 - val_recall: 0.4622 - 1s/epoch - 2ms/step
Epoch 15/100
558/558 - 1s - loss: 0.8725 - recall: 0.5434 - val_loss: 0.6668 - val_recall: 0.4335 - 1s/epoch - 3ms/step
Epoch 16/100
558/558 - 1s - loss: 0.8774 - recall: 0.5353 - val_loss: 0.6641 - val_recall: 0.4499 - 1s/epoch - 2ms/step
Epoch 17/100
558/558 - 2s - loss: 0.8510 - recall: 0.5436 - val_loss: 0.6511 - val_recall: 0.4458 - 2s/epoch - 3ms/step
Epoch 18/100
558/558 - 2s - loss: 0.8646 - recall: 0.5423 - val_loss: 0.6539 - val_recall: 0.4560 - 2s/epoch - 4ms/step
Epoch 19/100
558/558 - 3s - loss: 0.8573 - recall: 0.5423 - val_loss: 0.6604 - val_recall: 0.4928 - 3s/epoch - 6ms/step
Epoch 20/100
558/558 - 2s - loss: 0.8400 - recall: 0.5560 - val_loss: 0.6364 - val_recall: 0.4601 - 2s/epoch - 3ms/step
Epoch 21/100
558/558 - 1s - loss: 0.8376 - recall: 0.5474 - val_loss: 0.6451 - val_recall: 0.4785 - 1s/epoch - 2ms/step
Epoch 22/100
558/558 - 1s - loss: 0.8308 - recall: 0.5510 - val_loss: 0.6376 - val_recall: 0.4969 - 1s/epoch - 2ms/step
Epoch 23/100
558/558 - 1s - loss: 0.8213 - recall: 0.5631 - val_loss: 0.6452 - val_recall: 0.5133 - 1s/epoch - 2ms/step
Epoch 24/100
558/558 - 1s - loss: 0.8163 - recall: 0.5631 - val_loss: 0.6388 - val_recall: 0.5153 - 1s/epoch - 2ms/step
Epoch 25/100
558/558 - 1s - loss: 0.8251 - recall: 0.5553 - val_loss: 0.6396 - val_recall: 0.5133 - 1s/epoch - 3ms/step
Epoch 26/100
558/558 - 2s - loss: 0.8172 - recall: 0.5548 - val_loss: 0.6265 - val_recall: 0.4949 - 2s/epoch - 3ms/step
Epoch 27/100
558/558 - 2s - loss: 0.8211 - recall: 0.5501 - val_loss: 0.6270 - val_recall: 0.5194 - 2s/epoch - 4ms/step
Epoch 28/100
558/558 - 2s - loss: 0.8107 - recall: 0.5663 - val_loss: 0.6187 - val_recall: 0.5112 - 2s/epoch - 4ms/step
Epoch 29/100
558/558 - 2s - loss: 0.8095 - recall: 0.5616 - val_loss: 0.6239 - val_recall: 0.5256 - 2s/epoch - 3ms/step
Epoch 30/100
558/558 - 2s - loss: 0.8069 - recall: 0.5609 - val_loss: 0.6280 - val_recall: 0.5481 - 2s/epoch - 3ms/step
Epoch 31/100
558/558 - 1s - loss: 0.7923 - recall: 0.5667 - val_loss: 0.6261 - val_recall: 0.5460 - 1s/epoch - 3ms/step
Epoch 32/100
558/558 - 1s - loss: 0.7917 - recall: 0.5710 - val_loss: 0.6259 - val_recall: 0.5644 - 1s/epoch - 3ms/step
Epoch 33/100
558/558 - 1s - loss: 0.7909 - recall: 0.5687 - val_loss: 0.6201 - val_recall: 0.5542 - 1s/epoch - 2ms/step
Epoch 34/100
558/558 - 1s - loss: 0.7946 - recall: 0.5669 - val_loss: 0.6203 - val_recall: 0.5583 - 1s/epoch - 3ms/step
Epoch 35/100
558/558 - 2s - loss: 0.7928 - recall: 0.5750 - val_loss: 0.6235 - val_recall: 0.5726 - 2s/epoch - 3ms/step
Epoch 36/100
558/558 - 2s - loss: 0.7763 - recall: 0.5788 - val_loss: 0.6212 - val_recall: 0.5828 - 2s/epoch - 4ms/step
Epoch 37/100
558/558 - 2s - loss: 0.7756 - recall: 0.5808 - val_loss: 0.6180 - val_recall: 0.5808 - 2s/epoch - 4ms/step
Epoch 38/100
558/558 - 1s - loss: 0.7797 - recall: 0.5755 - val_loss: 0.6182 - val_recall: 0.5808 - 1s/epoch - 3ms/step
Epoch 39/100
558/558 - 1s - loss: 0.7696 - recall: 0.5856 - val_loss: 0.6152 - val_recall: 0.5808 - 1s/epoch - 3ms/step
Epoch 40/100
558/558 - 1s - loss: 0.7725 - recall: 0.5717 - val_loss: 0.6119 - val_recall: 0.5869 - 1s/epoch - 3ms/step
Epoch 41/100
558/558 - 1s - loss: 0.7748 - recall: 0.5793 - val_loss: 0.6083 - val_recall: 0.5808 - 1s/epoch - 3ms/step
Epoch 42/100
558/558 - 1s - loss: 0.7720 - recall: 0.5793 - val_loss: 0.6151 - val_recall: 0.5971 - 1s/epoch - 2ms/step
Epoch 43/100
558/558 - 1s - loss: 0.7567 - recall: 0.5898 - val_loss: 0.6193 - val_recall: 0.6074 - 1s/epoch - 3ms/step
Epoch 44/100
558/558 - 1s - loss: 0.7515 - recall: 0.5871 - val_loss: 0.6063 - val_recall: 0.5849 - 1s/epoch - 3ms/step
Epoch 45/100
558/558 - 2s - loss: 0.7619 - recall: 0.5885 - val_loss: 0.6047 - val_recall: 0.5828 - 2s/epoch - 4ms/step
Epoch 46/100
558/558 - 2s - loss: 0.7532 - recall: 0.5871 - val_loss: 0.6055 - val_recall: 0.5951 - 2s/epoch - 4ms/step
Epoch 47/100
558/558 - 2s - loss: 0.7532 - recall: 0.5860 - val_loss: 0.6083 - val_recall: 0.6033 - 2s/epoch - 3ms/step
Epoch 48/100
558/558 - 1s - loss: 0.7549 - recall: 0.5882 - val_loss: 0.6081 - val_recall: 0.6074 - 1s/epoch - 2ms/step
Epoch 49/100
558/558 - 1s - loss: 0.7484 - recall: 0.5918 - val_loss: 0.6034 - val_recall: 0.5971 - 1s/epoch - 3ms/step
Epoch 50/100
558/558 - 1s - loss: 0.7517 - recall: 0.5894 - val_loss: 0.6010 - val_recall: 0.6053 - 1s/epoch - 3ms/step
Epoch 51/100
558/558 - 1s - loss: 0.7465 - recall: 0.5930 - val_loss: 0.6076 - val_recall: 0.6155 - 1s/epoch - 2ms/step
Epoch 52/100
558/558 - 1s - loss: 0.7437 - recall: 0.5921 - val_loss: 0.6132 - val_recall: 0.6319 - 1s/epoch - 3ms/step
Epoch 53/100
558/558 - 1s - loss: 0.7475 - recall: 0.5936 - val_loss: 0.6110 - val_recall: 0.6237 - 1s/epoch - 2ms/step
Epoch 54/100
558/558 - 2s - loss: 0.7342 - recall: 0.5972 - val_loss: 0.5946 - val_recall: 0.6012 - 2s/epoch - 3ms/step
Epoch 55/100
558/558 - 2s - loss: 0.7315 - recall: 0.5905 - val_loss: 0.5988 - val_recall: 0.6053 - 2s/epoch - 4ms/step
Epoch 56/100
558/558 - 2s - loss: 0.7401 - recall: 0.5945 - val_loss: 0.6095 - val_recall: 0.6360 - 2s/epoch - 3ms/step
Epoch 57/100
558/558 - 1s - loss: 0.7332 - recall: 0.5977 - val_loss: 0.5983 - val_recall: 0.6094 - 1s/epoch - 2ms/step
Epoch 58/100
558/558 - 1s - loss: 0.7295 - recall: 0.6024 - val_loss: 0.5962 - val_recall: 0.6217 - 1s/epoch - 3ms/step
Epoch 59/100
558/558 - 1s - loss: 0.7286 - recall: 0.6015 - val_loss: 0.6034 - val_recall: 0.6299 - 1s/epoch - 3ms/step
Epoch 60/100
558/558 - 1s - loss: 0.7261 - recall: 0.5997 - val_loss: 0.6047 - val_recall: 0.6483 - 1s/epoch - 3ms/step
Epoch 61/100
558/558 - 1s - loss: 0.7227 - recall: 0.6060 - val_loss: 0.5985 - val_recall: 0.6299 - 1s/epoch - 3ms/step
Epoch 62/100
558/558 - 1s - loss: 0.7305 - recall: 0.5939 - val_loss: 0.6048 - val_recall: 0.6483 - 1s/epoch - 3ms/step
Epoch 63/100
558/558 - 2s - loss: 0.7192 - recall: 0.6064 - val_loss: 0.6002 - val_recall: 0.6442 - 2s/epoch - 3ms/step
Epoch 64/100
558/558 - 2s - loss: 0.7194 - recall: 0.6033 - val_loss: 0.6037 - val_recall: 0.6483 - 2s/epoch - 4ms/step
Epoch 65/100
558/558 - 2s - loss: 0.7284 - recall: 0.5954 - val_loss: 0.6064 - val_recall: 0.6605 - 2s/epoch - 3ms/step
Epoch 66/100
558/558 - 1s - loss: 0.7137 - recall: 0.6098 - val_loss: 0.6049 - val_recall: 0.6524 - 1s/epoch - 3ms/step
Epoch 67/100
558/558 - 1s - loss: 0.7152 - recall: 0.6037 - val_loss: 0.6048 - val_recall: 0.6564 - 1s/epoch - 2ms/step
Epoch 68/100
558/558 - 1s - loss: 0.7200 - recall: 0.6046 - val_loss: 0.5887 - val_recall: 0.6237 - 1s/epoch - 3ms/step
Epoch 69/100
558/558 - 1s - loss: 0.7026 - recall: 0.6226 - val_loss: 0.5980 - val_recall: 0.6401 - 1s/epoch - 2ms/step
Epoch 70/100
558/558 - 1s - loss: 0.7145 - recall: 0.6033 - val_loss: 0.6083 - val_recall: 0.6605 - 1s/epoch - 3ms/step
Epoch 71/100
558/558 - 2s - loss: 0.7074 - recall: 0.6082 - val_loss: 0.6036 - val_recall: 0.6544 - 2s/epoch - 3ms/step
Epoch 72/100
558/558 - 2s - loss: 0.7045 - recall: 0.6147 - val_loss: 0.5969 - val_recall: 0.6380 - 2s/epoch - 3ms/step
Epoch 73/100
558/558 - 2s - loss: 0.7161 - recall: 0.6037 - val_loss: 0.5962 - val_recall: 0.6503 - 2s/epoch - 4ms/step
Epoch 74/100
558/558 - 2s - loss: 0.7038 - recall: 0.6129 - val_loss: 0.6003 - val_recall: 0.6503 - 2s/epoch - 3ms/step
Epoch 75/100
558/558 - 1s - loss: 0.6983 - recall: 0.6217 - val_loss: 0.5997 - val_recall: 0.6544 - 1s/epoch - 3ms/step
Epoch 76/100
558/558 - 1s - loss: 0.7136 - recall: 0.6078 - val_loss: 0.6008 - val_recall: 0.6503 - 1s/epoch - 2ms/step
Epoch 77/100
558/558 - 1s - loss: 0.6971 - recall: 0.6244 - val_loss: 0.6002 - val_recall: 0.6544 - 1s/epoch - 2ms/step
Epoch 78/100
558/558 - 1s - loss: 0.7002 - recall: 0.6174 - val_loss: 0.6056 - val_recall: 0.6605 - 1s/epoch - 3ms/step
Epoch 79/100
558/558 - 2s - loss: 0.7065 - recall: 0.6107 - val_loss: 0.6018 - val_recall: 0.6544 - 2s/epoch - 3ms/step
Epoch 80/100
558/558 - 2s - loss: 0.7005 - recall: 0.6140 - val_loss: 0.6080 - val_recall: 0.6667 - 2s/epoch - 3ms/step
Epoch 81/100
558/558 - 2s - loss: 0.6982 - recall: 0.6122 - val_loss: 0.5984 - val_recall: 0.6585 - 2s/epoch - 3ms/step
Epoch 82/100
558/558 - 2s - loss: 0.6910 - recall: 0.6183 - val_loss: 0.6074 - val_recall: 0.6728 - 2s/epoch - 4ms/step
Epoch 83/100
558/558 - 2s - loss: 0.6993 - recall: 0.6221 - val_loss: 0.5974 - val_recall: 0.6503 - 2s/epoch - 3ms/step
Epoch 84/100
558/558 - 1s - loss: 0.6972 - recall: 0.6176 - val_loss: 0.5957 - val_recall: 0.6503 - 1s/epoch - 2ms/step
Epoch 85/100
558/558 - 2s - loss: 0.7001 - recall: 0.6232 - val_loss: 0.5956 - val_recall: 0.6564 - 2s/epoch - 3ms/step
Epoch 86/100
558/558 - 1s - loss: 0.6954 - recall: 0.6172 - val_loss: 0.6015 - val_recall: 0.6667 - 1s/epoch - 3ms/step
Epoch 87/100
558/558 - 1s - loss: 0.6909 - recall: 0.6205 - val_loss: 0.6007 - val_recall: 0.6646 - 1s/epoch - 3ms/step
Epoch 88/100
558/558 - 1s - loss: 0.6949 - recall: 0.6201 - val_loss: 0.6047 - val_recall: 0.6728 - 1s/epoch - 3ms/step
Epoch 89/100
558/558 - 1s - loss: 0.6916 - recall: 0.6239 - val_loss: 0.5946 - val_recall: 0.6605 - 1s/epoch - 2ms/step
Epoch 90/100
558/558 - 2s - loss: 0.6921 - recall: 0.6288 - val_loss: 0.6014 - val_recall: 0.6687 - 2s/epoch - 4ms/step
Epoch 91/100
558/558 - 2s - loss: 0.6852 - recall: 0.6277 - val_loss: 0.6055 - val_recall: 0.6810 - 2s/epoch - 4ms/step
Epoch 92/100
558/558 - 2s - loss: 0.6913 - recall: 0.6248 - val_loss: 0.6002 - val_recall: 0.6728 - 2s/epoch - 3ms/step
Epoch 93/100
558/558 - 1s - loss: 0.6774 - recall: 0.6371 - val_loss: 0.6001 - val_recall: 0.6728 - 1s/epoch - 2ms/step
Epoch 94/100
558/558 - 1s - loss: 0.6851 - recall: 0.6264 - val_loss: 0.6058 - val_recall: 0.6769 - 1s/epoch - 3ms/step
Epoch 95/100
558/558 - 1s - loss: 0.6918 - recall: 0.6187 - val_loss: 0.5993 - val_recall: 0.6728 - 1s/epoch - 3ms/step
Epoch 96/100
558/558 - 1s - loss: 0.6845 - recall: 0.6205 - val_loss: 0.6004 - val_recall: 0.6748 - 1s/epoch - 3ms/step
Epoch 97/100
558/558 - 1s - loss: 0.6812 - recall: 0.6309 - val_loss: 0.5928 - val_recall: 0.6585 - 1s/epoch - 3ms/step
Epoch 98/100
558/558 - 1s - loss: 0.6852 - recall: 0.6282 - val_loss: 0.6007 - val_recall: 0.6810 - 1s/epoch - 3ms/step
Epoch 99/100
558/558 - 2s - loss: 0.6855 - recall: 0.6183 - val_loss: 0.6037 - val_recall: 0.6789 - 2s/epoch - 3ms/step
Epoch 100/100
558/558 - 2s - loss: 0.6832 - recall: 0.6302 - val_loss: 0.5969 - val_recall: 0.6728 - 2s/epoch - 4ms/step
plot(results.iloc[i]['history'],'loss')
plot(results.iloc[i]['history'],'recall')
results.iloc[i]
# hidden layers                                                       2
# neurons - hidden layer                                 [128, 128, 64]
activation function - hidden layer                         [relu, relu]
# epochs                                                            100
batch size                                                           16
optimizer                                                       SGD-Mom
learning rate, momentum, dropout                     [1e-05, 0.75, 0.3]
weight initializer                                           he_uniform
regularization                                                        -
train loss                                                     0.683199
validation loss                                                0.596861
train recall                                                   0.630186
validation recall                                              0.672802
time (secs)                                                       205.1
model                 <keras.src.engine.sequential.Sequential object...
history               <keras.src.callbacks.History object at 0x7dfcd...
Name: 17, dtype: object
The table below summarizes the results from the various scenarios.
# results.sort_values(by='train recall',axis=0,ascending=False)
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum, dropout | weight initializer | regularization | train loss | validation loss | train recall | validation recall | time (secs) | model | history | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | AdaGrad | [0.001, 0.0, 0] | he_uniform | - | 0.349060 | 0.399920 | 0.450482 | 0.433538 | 117.99 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfce... |
| 1 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | SGD | [0.001, 0.0, 0] | he_uniform | - | 0.307272 | 0.410196 | 0.554776 | 0.468303 | 115.66 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfce... |
| 2 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | SGD-Mom | [0.001, 0.5, 0] | he_uniform | - | 0.260472 | 0.462195 | 0.626643 | 0.456033 | 143.31 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfce... |
| 3 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | RMS | [0.0001, 0.75, 0] | he_uniform | - | 0.153165 | 0.725464 | 0.816827 | 0.482618 | 134.90 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfce... |
| 4 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | Adam | [0.0001, 0.0, 0] | he_uniform | - | 0.161517 | 0.646090 | 0.792287 | 0.462168 | 143.60 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfce... |
| 5 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | Adam | [0.0001, 0.0, 0.2] | he_uniform | - | 0.362576 | 0.347113 | 0.441718 | 0.480573 | 142.95 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfce... |
| 6 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | Adam | [0.0001, 0.0, 0.3] | he_uniform | - | 0.384523 | 0.349065 | 0.373357 | 0.437628 | 143.73 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfce... |
| 7 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | SGD | [0.001, 0.0, 0] | he_uniform | - | 0.307250 | 0.537618 | 0.872169 | 0.642127 | 177.00 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfce... |
| 8 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | SGD-Mom | [1e-05, 0.9, 0] | he_uniform | - | 0.492929 | 0.523605 | 0.776407 | 0.666667 | 203.16 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfce... |
| 9 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | RMS | [1e-05, 0.9, 0] | he_uniform | - | 0.181092 | 0.746226 | 0.931823 | 0.593047 | 203.36 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfce... |
| 10 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | Adam | [1e-05, 0.0, 0] | he_uniform | - | 0.377406 | 0.477113 | 0.837407 | 0.635992 | 165.31 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfcd... |
| 11 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | Adam | [1e-05, 0.0, 0.2] | he_uniform | - | 0.548106 | 0.489455 | 0.720565 | 0.707566 | 174.82 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfcd... |
| 12 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | Adam | [1e-05, 0.0, 0.3] | he_uniform | - | 0.598142 | 0.519198 | 0.689392 | 0.728016 | 174.72 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfcd... |
| 13 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | SGD-Mom | [1e-05, 0.0, 0.2] | he_uniform | - | 0.758918 | 0.650358 | 0.598116 | 0.570552 | 203.18 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfcd... |
| 14 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | SGD-Mom | [1e-05, 0.9, 0.2] | he_uniform | - | 0.599020 | 0.568359 | 0.688047 | 0.723926 | 165.77 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfcd... |
| 15 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | SGD-Mom | [1e-05, 0.75, 0.3] | he_uniform | - | 0.683199 | 0.596861 | 0.630186 | 0.672802 | 161.06 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfcd... |
| 16 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | SGD-Mom | [1e-05, 0.9, 0.2] | he_uniform | - | 0.599020 | 0.568359 | 0.688047 | 0.723926 | 204.87 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfcd... |
| 17 | 2 | [128, 128, 64] | [relu, relu] | 100 | 16 | SGD-Mom | [1e-05, 0.75, 0.3] | he_uniform | - | 0.683199 | 0.596861 | 0.630186 | 0.672802 | 205.10 | <keras.src.engine.sequential.Sequential object... | <keras.src.callbacks.History object at 0x7dfcd... |
print('The recall scores from various models are listed below')
printlist = ['Adagrad', 'SGD', 'SGD with Momentum' ,'RMSProp', 'Adam with no dropout', 'Adam with 0.2 dropout', 'Adam with 0.3 dropout',
'Smote - SGD', 'Smote - SGD with Momentum','Smote - RMSProp', 'Smote - Adam with no dropout', 'Smote Adam with 0.2 dropout', 'Smote Adam with 0.3 dropout',
'Smote SGD with 0.2 dropout', 'Smote SGD with Momentum with 0.2 dropout', 'Smote SGD with Momentum with 0.3 dropout']
output = pd.DataFrame(columns=['Model', 'Training Recall Score', 'Validation Recall Score'])
for inc, model in enumerate(printlist):
output.loc[inc] = [model, results.iloc[inc]['train recall'], results.iloc[inc]['validation recall']]
output
The recall scores from various models are listed below
| Model | Training Recall Score | Validation Recall Score | |
|---|---|---|---|
| 0 | Adagrad | 0.450482 | 0.433538 |
| 1 | SGD | 0.554776 | 0.468303 |
| 2 | SGD with Momentum | 0.626643 | 0.456033 |
| 3 | RMSProp | 0.816827 | 0.482618 |
| 4 | Adam with no dropout | 0.792287 | 0.462168 |
| 5 | Adam with 0.2 dropout | 0.441718 | 0.480573 |
| 6 | Adam with 0.3 dropout | 0.373357 | 0.437628 |
| 7 | Smote - SGD | 0.872169 | 0.642127 |
| 8 | Smote - SGD with Momentum | 0.776407 | 0.666667 |
| 9 | Smote - RMSProp | 0.931823 | 0.593047 |
| 10 | Smote - Adam with no dropout | 0.837407 | 0.635992 |
| 11 | Smote Adam with 0.2 dropout | 0.720565 | 0.707566 |
| 12 | Smote Adam with 0.3 dropout | 0.689392 | 0.728016 |
| 13 | Smote SGD with 0.2 dropout | 0.598116 | 0.570552 |
| 14 | Smote SGD with Momentum with 0.2 dropout | 0.688047 | 0.723926 |
| 15 | Smote SGD with Momentum with 0.3 dropout | 0.630186 | 0.672802 |
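The comparison in the table above can also be done programmatically. Below is a minimal sketch using a toy DataFrame with the same column names (the real `results` frame holds live Keras objects, so a stand-in is used here); it ranks candidates by validation recall and breaks ties with the smallest train/validation gap, one possible rough proxy for less overfitting:

```python
import pandas as pd

# Toy stand-in for the notebook's `results` table (same column names,
# values copied from three of the rows above).
results_demo = pd.DataFrame({
    'Model': ['Smote Adam with 0.2 dropout',
              'Smote Adam with 0.3 dropout',
              'Smote SGD with Momentum with 0.2 dropout'],
    'train recall': [0.720565, 0.689392, 0.688047],
    'validation recall': [0.707566, 0.728016, 0.723926],
})

# Rank by validation recall (descending), then by the absolute
# train/validation recall gap (ascending).
results_demo['gap'] = (results_demo['train recall']
                       - results_demo['validation recall']).abs()
ranked = results_demo.sort_values(['validation recall', 'gap'],
                                  ascending=[False, True])
best = ranked.iloc[0]
print(best['Model'], round(best['validation recall'], 3))
```

Note that validation recall alone is only one selection criterion; the notebook also inspects test-set classification reports before settling on a final model.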
The recall scores are clearly better for the SMOTE-based models, in particular Adam with 0.2 and 0.3 dropout and SGD with momentum with 0.2 dropout, so these three are evaluated on the test data below.
# confusion matrix helper
from sklearn import metrics

def make_confusion_matrix(titlestr, ytrue, ypred):
    cm = metrics.confusion_matrix(ytrue, ypred, labels=[0, 1])
    df_cm = pd.DataFrame(cm, index=["Actual - No", "Actual - Yes"],
                         columns=["Predicted - No", "Predicted - Yes"])
    # Annotate each cell with both the raw count and the share of all samples
    group_counts = ["{0:0.0f}".format(value) for value in cm.flatten()]
    group_percentages = ["{0:.2%}".format(value) for value in cm.flatten() / np.sum(cm)]
    labels = [f"{v1}\n{v2}" for v1, v2 in zip(group_counts, group_percentages)]
    labels = np.asarray(labels).reshape(2, 2)
    plt.figure(figsize=(5, 4))
    plt.title(titlestr)
    sns.heatmap(df_cm, annot=labels, fmt='')
    plt.ylabel('True label')
    plt.xlabel('Predicted label')
# Oversampled data with Adam optimizer with a dropout rate of 0.2
model_1 = results.iloc[11]['model']
# Predicting using the model and Converting the sigmoid output to binary
y1_test_predicted = model_1.predict(X_test)
y1_test_predicted = y1_test_predicted > 0.5
print ('model_1', model_1)
from sklearn import metrics
cr1= metrics.classification_report(y_test,y1_test_predicted)
print(cr1)
make_confusion_matrix('Confusion_Matrix_Testing',y_test, y1_test_predicted)
63/63 [==============================] - 0s 2ms/step
model_1 <keras.src.engine.sequential.Sequential object at 0x7dfcee7a73a0>
precision recall f1-score support
0 0.91 0.77 0.84 1593
1 0.44 0.71 0.54 407
accuracy 0.76 2000
macro avg 0.68 0.74 0.69 2000
weighted avg 0.82 0.76 0.78 2000
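The `> 0.5` cutoff used to binarize the sigmoid output is itself tunable: because churners are the rare class and recall is the priority, lowering the cutoff turns more borderline scores into churn predictions, trading precision for recall. A self-contained sketch with synthetic sigmoid scores (the distributions and numbers below are illustrative, not from this notebook):

```python
import numpy as np
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)
# Synthetic sigmoid outputs: churners (label 1) tend to score higher.
y_true = np.array([0] * 80 + [1] * 20)
scores = np.concatenate([rng.beta(2, 5, 80),   # non-churners, skewed low
                         rng.beta(5, 2, 20)])  # churners, skewed high

# Lowering the threshold can only add predicted positives,
# so recall is non-decreasing while precision tends to drop.
for threshold in (0.5, 0.4, 0.3):
    y_pred = (scores > threshold).astype(int)
    print(f"threshold={threshold:.1f}  "
          f"precision={precision_score(y_true, y_pred):.2f}  "
          f"recall={recall_score(y_true, y_pred):.2f}")
```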
# Oversampled data with Adam optimizer with a dropout rate of 0.3
model_2 = results.iloc[12]['model']
# Predicting using the model and Converting the sigmoid output to binary
y2_test_predicted = model_2.predict(X_test)
y2_test_predicted = y2_test_predicted > 0.5
print ('model_2', model_2)
from sklearn import metrics
cr2= metrics.classification_report(y_test,y2_test_predicted)
print(cr2)
make_confusion_matrix('Confusion_Matrix_Testing',y_test, y2_test_predicted)
63/63 [==============================] - 0s 2ms/step
model_2 <keras.src.engine.sequential.Sequential object at 0x7dfcdf1da470>
precision recall f1-score support
0 0.90 0.75 0.82 1593
1 0.41 0.68 0.51 407
accuracy 0.73 2000
macro avg 0.65 0.71 0.66 2000
weighted avg 0.80 0.73 0.75 2000
#Smote SGD with Momentum with 0.2 dropout
model_3 = results.iloc[14]['model']
# Predicting using the model and Converting the sigmoid output to binary
y3_test_predicted = model_3.predict(X_test) > 0.5
print ('model_3', model_3)
from sklearn import metrics
cr3= metrics.classification_report(y_test,y3_test_predicted)
print(cr3)
make_confusion_matrix('Confusion_Matrix_Testing',y_test, y3_test_predicted)
63/63 [==============================] - 0s 2ms/step
model_3 <keras.src.engine.sequential.Sequential object at 0x7dfcde2c6cb0>
precision recall f1-score support
0 0.90 0.69 0.78 1593
1 0.37 0.70 0.48 407
accuracy 0.69 2000
macro avg 0.63 0.70 0.63 2000
weighted avg 0.79 0.69 0.72 2000
Overall, the 10,000 records provided in the data set were not balanced in terms of customer churn: only about 20% of the records belonged to customers who actually churned, so the data set was imbalanced.
As the number of records was relatively low, I used an architecture with three hidden layers and ReLU as the activation function. Since the output is a binary classification, a sigmoid activation was used in the output layer. To test performance improvements, two dropout layers were introduced and the optimizers were tested with dropout rates of 0.2 and 0.3.
With the optimizers' default parameters, training the neural network on the data as provided gave poor recall scores, so the data had to be oversampled to improve the network's ability to predict customer behaviour. The SMOTE technique was used to oversample the training data, and it yielded better results than using the data as-is.
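SMOTE itself comes from the imbalanced-learn package (`imblearn.over_sampling.SMOTE`) and its invocation is not shown in this section. As a self-contained illustration of the oversampling idea, the sketch below does simple random oversampling of the minority class with numpy; SMOTE goes further by synthesizing new minority points interpolated between nearest neighbours rather than duplicating existing rows:

```python
import numpy as np

def random_oversample(X, y, random_state=0):
    """Duplicate minority-class rows until every class matches the
    majority-class size. (SMOTE would instead interpolate between
    minority-class neighbours to create new synthetic rows.)"""
    rng = np.random.default_rng(random_state)
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    idx = []
    for cls, count in zip(classes, counts):
        cls_idx = np.flatnonzero(y == cls)
        # Keep all original rows, then resample extras with replacement.
        extra = rng.choice(cls_idx, size=n_max - count, replace=True)
        idx.append(np.concatenate([cls_idx, extra]))
    idx = np.concatenate(idx)
    return X[idx], y[idx]

# Toy imbalanced set: 8 stayers, 2 churners (roughly the 80/20 split above).
X = np.arange(20).reshape(10, 2).astype(float)
y = np.array([0] * 8 + [1] * 2)
X_over, y_over = random_oversample(X, y)
print(np.bincount(y_over))  # both classes now have 8 rows
```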
In this case, recall was the metric to optimize, since identifying the customers who are likely to leave is critical. The best-fit models were each able to identify roughly 70% of the churning customers correctly; put differently, roughly 7% of all test-set customers were churners that the models failed to flag. 'Oversampled data with Adam optimizer with a dropout rate of 0.2' was the best-fit model out of all the models here.
Below are the actionable insights and recommendations based on the models.
Actionable insights:
Recommendations:
Power Ahead